US3725563A - Method of perspective transformation in scanned raster visual display - Google Patents

Method of perspective transformation in scanned raster visual display

Info

Publication number
US3725563A
Authority
US
United States
Prior art keywords
image
spot
display
determining
axes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US00211372A
Inventor
B Woycechowsky
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Singer Co
CAE Link Corp
Original Assignee
Singer Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Singer Co
Application granted
Publication of US3725563A
Assigned to LINK FLIGHT SIMULATION CORPORATION, KIRKWOOD INDUSTRIAL PARK, BINGHAMTON, NY 13902-1237, A DE CORP. Assignment of assignors interest. Assignors: SINGER COMPANY, THE, A NJ CORP.
Assigned to CAE-LINK CORPORATION, A CORP. OF DE. Merger, December 1, 1988, Delaware (see document for details). Assignors: CAE-LINK CORPORATION, A CORP. OF DE (changed to), LINK TACTICAL MILITARY SIMULATION CORPORATION, A CORP. OF DE, LINK FLIGHT SIMULATION CORPORATION, A DE CORP., LINK TRAINING SERVICES CORPORATION, A CORP. OF DE (merged into)
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00 Simulators for teaching or training purposes
    • G09B 9/02 Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B 9/08 Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
    • G09B 9/30 Simulation of view from aircraft
    • G09B 9/301 Simulation of view from aircraft by computer-processed or -generated image
    • G09B 9/302 Simulation of view from aircraft by computer-processed or -generated image, the image being transformed by computer processing, e.g. updating the image to correspond to the changing point of view

Definitions

  • a display system for presenting to an observer a desired simulated scene of the earth's surface as viewed from the observer's viewpoint, comprising an image source depicting a portion of the earth's surface as viewed from an image viewing point, at least part of which scene contains the same information as that contained in the desired scene; a device with a controllable first spot for scanning the image source to develop a video signal; a display located within the observer's field of view to form a simulated window through which the observer may view said simulated scene, said display being of the type formed by scanning a second spot across the display to form a raster and modulating said second spot with the video signal developed by said device; a method of driving said first spot to obtain an image of the desired scene in proper perspective on said display comprising:
  • said display is a wide angle spherical display having a fixed frame of reference, only a relatively small portion of which is modulated by said video signal, the center of said portion is movable and may be defined by a latitude and longitude, said second set of direction cosines are the fixed display frame to body axes direction cosines; and the direction of said line in said display frame is obtained by adding the scan waveforms of said second spot to said latitude and longitude.
  • a display system for presenting to an observer a desired simulated scene of the earth's surface as viewed from the observer's viewpoint, comprising an image source depicting a portion of the earth's surface as viewed from an image viewing point, at least part of which scene contains the same information as that contained in the desired scene; a device with a controllable first spot for scanning the image source to develop a video signal; a display located within the observer's field of view to form a simulated window through which the observer may view said simulated scene, said display being of the type formed by scanning a second spot across the display to form a raster and modulating said second spot with the video signal developed by said device; apparatus for driving said first spot to obtain an image of the desired scene in proper perspective on said display comprising:
  • said device is a flying spot scanner, pickup photomultiplier tube and associated optics, and wherein said device is arranged to scan said frame.
  • said image source is the image obtained from an optical probe viewing a model
  • said model is a portion of the earth's surface
  • said device is a TV camera on which said image is focused.

Abstract

A general method of providing perspective transformations in a visual display system having an image generated by a scanned raster device such as a CRT or television projector is shown. The television display is a window out of which an observer views a simulated picture of terrain. The line of sight from the observer passing through the instantaneous spot position on the window is used to find a ground intersection point; the location of this point on an image source, such as film, is found; and a video signal representing that image is generated by positioning the scan of an image pickup device to that point on the image source.

Description

United States Patent [11] 3,725,563
Woycechowsky [45] Apr. 3, 1973

[54] METHOD OF PERSPECTIVE TRANSFORMATION IN SCANNED RASTER VISUAL DISPLAY
[75] Inventor: Brian J. Woycechowsky, Binghamton, N.Y.
[73] Assignee: The Singer Company, Binghamton, N.Y.
[22] Filed: Dec. 23, 1971
[21] Appl. No.: 211,372
[63] Related U.S. Application Data: continuation-in-part of Ser. No. 134,238, Apr. 15, 1971, abandoned
[51] Int. Cl.: G09b 9/08, H04n 3/30, G01s 7/20
[58] Field of Search: 178/DIG. 20, DIG. 35; 35/10.2, 35/12 N; 235/186; 343/7.9

References Cited, United States Patents:
3,098,929 7/1963 Kirchner 343/7.9 X
3,261,912 7/1966 Hemstreet 178/DIG. 20
3,060,596 10/1962 Tucker et al. 35/10.2

Primary Examiner: Malcolm A. Morrison
Assistant Examiner: R. Stephen Dildine, Jr.
Attorney: Francis L. Masselle et al.

19 Claims, 11 Drawing Figures
METHOD OF PERSPECTIVE TRANSFORMATION IN SCANNED RASTER VISUAL DISPLAY

This invention relates to visual systems in general and more particularly to a method and apparatus for raster shaping to alter a perspective point in a visual system, and is a continuation-in-part of application Ser. No. 134,238 filed Apr. 15, 1971, and now abandoned.
Visual systems for use in aircraft simulators and other types of trainers have gained widespread use due to the increased cost of training in an actual aircraft or other vehicle or device. In general, four basic types of visual systems have been used, one of which is a camera model system in which a probe containing a TV camera is moved over a scale terrain model in accordance with computed attitude and position of the simulator. The resulting image is displayed to the trainee with a TV projector or CRT.
A second type system is the film-based system in which a predetermined path is flown by an aircraft and a motion picture recorded. The motion picture is then shown to the trainee as he flies the same path. Deviations may be simulated by optical distortion as is shown in patents granted to H. S. Hemstreet such as U.S. Pat. No. 3,233,508 granted on Feb. 8, 1966 and U.S. Pat. No. 3,261,912 granted on July 19, 1966. Also disclosed therein is a variation of the system of the present invention in which the film image is viewed by a TV camera and the resulting image projected via TV projector or CRT. Distortion in that case is accomplished by raster shaping.
A third type of system is a scan-transparency system wherein an image is generated by scanning a transparency containing orthophotographic information. The information generated is displayed via TV as in the previous example. Such a system is shown in U.S. Pat. No. 3,439,105, granted to W. C. Ebeling et al. on Apr. 15, 1969.
A fourth type system is a digital image generation system. In such a system image information is stored in a computer which selects the desired information for display in a TV type display.
A variation of the film based system viewed by a TV camera is a film based system scanned by a flying spot scanner to generate an image. The film based system and the scan transparency system have in common an important aspect: the recorded information on them is from a specific viewpoint. To produce a scene as it would appear if viewed from another viewpoint requires raster shaping. Although the camera model and digital systems do not have this restriction, there may be cases where it is desired to cause a change in viewpoint by raster shaping rather than moving the camera probe or reconstructing the digital image. For example, in the former case problems arise as the probe gets close to the model. In the latter, construction of images uses considerable computer time.
The present invention provides a system which may be used for raster shaping in any visual system where it is desired to transform an image containing information as viewed from one viewpoint to an image which appears as if viewed from another viewpoint.
It is the object of this invention to provide a general system which controls the shape of a raster in a visual simulation system such that a desired perspective change is achieved.
Other objects of the invention will in part be obvious and will in part appear hereinafter.
The invention accordingly comprises the several steps and the relation of one or more of such steps with respect to each of the others, and the apparatus embodying features of construction, combination(s) of elements and arrangement of parts which are adapted to effect such steps, all as exemplified in the following detailed disclosure, and the scope of the invention will be indicated in the claims.
For a fuller understanding of the nature and objects of the invention reference should be had to the following detailed descriptions taken in connection with the accompanying drawings, in which:
FIG. 1 is a block diagram of a preferred embodiment of the invention in combination with an aircraft simulator;
FIG. 2 is a flow diagram of a preferred set of equations for use with the invention;
FIG. 3 is a perspective view of the relationship between an observer's view through the display and the view on the image source;
FIG. 4 is a schematic view of a first type of scanned raster device;
FIG. 5 is a schematic view of a second type of scanned raster device;
FIG. 6 is a block diagram of a preferred embodiment of a raster computer for implementing the equations of FIG. 2;
FIG. 7 is a block diagram of a modification to the embodiment of FIG. 6 for compensating for image roll in those systems where it is desirable to roll the image before raster shaping is introduced;
FIG. 8 is a block diagram showing a second form of the equations of FIG. 2;
FIG. 9 is a block diagram of the implementation of the equations of FIG. 8;
FIG. 10 is a block diagram of the equations of FIG. 8 in rate rather than position form; and
FIG. 11 is a block diagram of a third form which the equations of FIG. 2 may take.
FIG. 1 is a basic representation of the systems with which the present invention may be used. Block 11 is an image source. It may be an image recorded on a frame of film, the image picked up by a probe in a camera model system, a digitally generated image, an orthophoto or other image. Block 13 is a scanned raster device. It may be a TV camera viewing image source 11 or a flying spot scanner device scanning image source 11 to produce a video signal.
Display 15 may be one or more TV projectors, CRTs, laser projectors or other similar devices capable of projecting a video signal. Raster computer 17 is the system of the present invention which shapes the raster of device 13 to obtain the desired perspective. Sync generator 19 provides sync commands to synchronize the scans of raster device 13 and display 15.
In each case the image presented by block 11 will represent a scene as it would appear from a predetermined viewpoint. If it is film it will be as viewed from the location of the taking camera; if a probe image it will depend on the probe position and attitude. Likewise, if an orthophoto it will appear as a map view from a certain altitude and if digitally generated it will represent a view based on computer inputs. In each case, however, the viewpoint of the image is known.
Examining the balance of FIG. 1 will further show the problem to be solved by the present invention. Display 15 is in a position to be viewed by a trainee in simulator cockpit 21. This cockpit will contain controls and instruments duplicating those of the actual aircraft being simulated. Control movements will be supplied as inputs to computer 23 which will use these inputs in equations of motion to compute the aircraft state vector (position, attitude, velocity). From this computed data, outputs from computer 23 drive the instruments in cockpit 21 such as altimeter, airspeed, etc.
The state vector information of the aircraft is available in computer 23 and may be used along with the information concerning the viewpoint from which the image was made by raster computer 17 in determining proper raster shape. This viewpoint information is contained in block 11 and is provided to raster computer 17 and/or simulator computer 23. These two computers work together, as will be explained later.
The information may, for example, be recorded on the film and picked up by a device in block 11 in a film based system. In a camera model system the position and attitude of the probe will be available. In a scan transparency system, the scale of the orthophoto will be known; and in a computer generated image, the inputs used in constructing the image will be known. Thus, computers 17 and 23 have available the state vector of the simulated aircraft and the state vector of the image present in image source 11. This information will of course be constantly updated as the simulator flies and as the image changes due to film advancement, probe movement, etc.
A third type of information is used in the present invention. This is the instantaneous position of the scanning spot on the display as referenced to the eyepoint of the trainee. This information is known indirectly through sync generator 19 which controls the scanning of the spot on display 15. For an explanation of how the display raster may be made quite accurate see U.S. application Ser. No. 130,217 filed by R. F. H. McCoy et al. on Apr. 1, 1971 and assigned to the same assignee as the present invention.
In general terms, it is known from the sync commands of generator 19 when the sweep is started, and the characteristics of the sweep are known. From this information it is possible to compute the instantaneous spot position as will be shown in more detail later.
Using these items of information, i.e., the aircraft state vector, the image source state vector, and the instantaneous spot position, along with the relationship between the aircraft body axes and the display axes, it is possible to compute the intersection of a line from the pilot's nominal eyepoint passing through the instantaneous spot position with the ground, to then determine where or if that point intersects the image source, and then to position the scan of device 13 to that spot.
FIG. 2 shows a flow diagram of the computations. From the state vector of the simulated aircraft, the rotation of the aircraft with respect to a horizontal frame of reference is known. These rotations are θ_s, the simulated pitch angle; φ_s, the simulated roll angle; and ψ_s, the simulated heading angle. From these angles, computer 23 of FIG. 1 may compute the sines and cosines of the angles; and from the sines and cosines, the direction cosines of the simulated aircraft body to ground reference axes. This computation is shown in block 25 of FIG. 2 and results in the α matrix.
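By way of illustration only, the computation of block 25 can be sketched in modern terms. The following Python fragment assumes the conventional aerospace heading-pitch-roll (ψ, θ, φ) rotation order; the patent does not spell out its exact sign conventions, so treat the element layout as an assumption:

    import math

    def body_to_ground_dcm(psi, theta, phi):
        # Direction cosines (the "alpha" matrix of block 25) taking body-axis
        # components into the horizontal ground frame; angles in radians,
        # heading-pitch-roll (Z-Y-X) rotation order assumed.
        cps, sps = math.cos(psi), math.sin(psi)
        cth, sth = math.cos(theta), math.sin(theta)
        cph, sph = math.cos(phi), math.sin(phi)
        return [
            [cth * cps, sph * sth * cps - cph * sps, cph * sth * cps + sph * sps],
            [cth * sps, sph * sth * sps + cph * cps, cph * sth * sps - sph * cps],
            [-sth, sph * cth, cph * cth],
        ]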
Computer 23 may also compute the direction cosines of the window axes referenced to the body axes (the ω matrix) from the window heading with respect to the body axes, the window pitch with respect to the body axes, and the window roll with respect to the body axes. The computation required to evaluate the ω's corresponds to the α computation shown in block 25. If the window axes are fixed with respect to the body axes, the ω's are constant and therefore need not be continuously computed. The evaluation of the ω's is indicated in block 27.
In general, the simulated eyepoint is located some distance away from the simulated center of gravity. In situations where the eyepoint displacement is significant (e.g., takeoff and landing situations for transport aircraft), the eyepoint displacement with respect to the center of gravity must be taken into account. The components of eyepoint displacement with respect to the center of gravity are referenced to the horizontal frame of reference by multiplying the body axes coordinates of the eyepoint (x_EP, y_EP, z_EP) by the α matrix. The evaluation of the horizontal frame components of the eyepoint with respect to the center of gravity (x_EPH, y_EPH, z_EPH) is shown in block 29. Eyepoint altitude with respect to the horizontal plane of reference, h_EP, is also computed in block 29 by subtracting z_EPH from the altitude of the simulated aircraft, h_s.
Horizontal frame of reference components of eyepoint position relative to image position (Δx and Δy) are found by respectively adding x_EPH and y_EPH to the horizontal frame components of the simulated aircraft's center of gravity (x_s and y_s) and then subtracting the image position coordinates (x_I and y_I). This computation is shown in block 31 of FIG. 2.
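A minimal sketch of blocks 29 and 31, continuing the fragment above; the argument names are hypothetical, and alpha is the matrix returned by body_to_ground_dcm:

    def eyepoint_terms(alpha, ep_body, h_s, x_s, y_s, x_i, y_i):
        # Block 29: rotate the eyepoint offset (body axes) into the horizontal
        # frame and form the eyepoint altitude; block 31: form the horizontal
        # offsets of the eyepoint relative to the image position.
        x_ep, y_ep, z_ep = (sum(alpha[i][j] * ep_body[j] for j in range(3))
                            for i in range(3))
        h_ep = h_s - z_ep      # vertical axis taken positive downward here
        dx = x_s + x_ep - x_i
        dy = y_s + y_ep - y_i
        return h_ep, dx, dy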
The altitude associated with a frame of film (h_I), or with a probe in a camera model system, etc., and the image attitude, represented by ψ_I, θ_I and φ_I, are provided directly to the camera raster computer 17 of FIG. 1 by the image source 11.
The remainder of the computations must be done in raster computer 17 of FIG. 1, which is an analog computer, due to the fact that computations are being done for an instantaneous spot position. The angles ψ_w and θ_w, representing the coordinates of the instantaneous spot position as viewed from the pilot's eyepoint, are generated in a manner to be described later. For a rectangular window located a unit distance from a nominal pilot's eyepoint, the window axes coordinates of the instantaneous spot position are (1, tan ψ_w, tan θ_w). These quantities are transformed to the body axes in block 33 through the use of the ω matrix to obtain the direction lines of a line passing through the instantaneous spot position referenced to the body frame (e_1, e_2, e_3). If the screen is spherical rather than planar, this line is defined by the direction cosines cos ψ_w cos θ_w, sin ψ_w cos θ_w, and sin θ_w, where ψ_w and θ_w are the respective window referenced longitude and latitude of the instantaneous spot position. These direction lines must be transformed to the horizontal reference system. This is done by multiplying them by the α matrix in block 35. The ω matrix and α matrix may be combined to form a window axes to ground axes matrix prior to entering the analog computer, in which case blocks 33 and 35 would be combined.
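Blocks 33 and 35 amount to two matrix products applied to the spot's window coordinates. A sketch under the same assumptions as the fragments above, where omega (window to body) and alpha (body to ground) are 3x3 lists of direction cosines:

    import math

    def spot_direction_ground(omega, alpha, psi_w, theta_w, spherical=False):
        # Direction of the sight line through the instantaneous spot: window
        # axes first, then body axes (block 33), then ground frame (block 35).
        if spherical:
            e_win = [math.cos(psi_w) * math.cos(theta_w),
                     math.sin(psi_w) * math.cos(theta_w),
                     math.sin(theta_w)]
        else:
            e_win = [1.0, math.tan(psi_w), math.tan(theta_w)]  # unit-distance window
        e_body = [sum(omega[i][j] * e_win[j] for j in range(3)) for i in range(3)]
        return [sum(alpha[i][j] * e_body[j] for j in range(3)) for i in range(3)]  # d1, d2, d3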
Referring now to FIG. 3, point 37 is a fixed point on the ground which is the reference for x_s, y_s and x_I, y_I. The X and Y position of the simulated eyepoint 39 with respect to the image axes 41, Δx and Δy, are also shown. Line 43 is the line passing through the instantaneous spot 45 on display face 47. Since the direction lines of line 43 and the eyepoint altitude have been obtained, it is now possible to find the horizontal components (h_EP·d_1/d_3 and h_EP·d_2/d_3) of line 43. This computation is done in block 51 of FIG. 2, where they are added to the horizontal components of the eyepoint with respect to the image position. The results of the computation done in block 51 of FIG. 2 are the horizontal components of the ground intersection point 49 of line 43 with respect to the image position. If the ground is assumed to be horizontal, the vertical component of the ground intersection point 49 with respect to the image position is the altitude of the image position, h_I. However, these components are multiplied by d_3 in block 51 in order to avoid divisions by d_3. It can be seen from FIG. 2 that when the image coordinates are obtained in block 59, the multiplications of these components by d_3 are cancelled.
Now that the ground intersection point 49 of FIG. 3 is known, referenced to the image axes 41, it is only necessary to determine the image plane coordinates of the intersection of line 53 (the line from the image axis origin 41 to ground intersection point 49) and the image plane 55. This is done in blocks 57 and 59. Block 57 transforms x_GI·d_3, y_GI·d_3 and z_GI·d_3 into a frame having two of its axes in the image plane 55 using horizontal frame to image frame direction cosines. These direction cosines are defined in terms of the trigonometric functions of ψ_I, θ_I and φ_I, just as the α's are made up of terms containing trigonometric functions of ψ_s, θ_s and φ_s. The final step is shown in block 59. By similar triangles the Y and Z coordinates in the image plane, y_f and z_f, are found from x_p·d_3, y_p·d_3 and z_p·d_3. In a film system, f is the focal length of the taking camera and hence block 59 shows the value of f multiplying y_p·d_3/(x_p·d_3) and z_p·d_3/(x_p·d_3). If the image is obtained from a camera viewing a model, the value of f will again be the appropriate focal length. In a digitally generated image, this value will be stored, since it is used in the generation of the image.
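The remaining arithmetic of blocks 51, 57 and 59 can be sketched the same way. Here h2i is an assumed name for the horizontal-frame-to-image-frame matrix; note how the d_3 scale factors cancel in the final divide, which is the point of carrying them:

    def image_plane_coords(d, h_ep, h_i, dx, dy, h2i, f):
        # Block 51: ground-intersection components, each kept scaled by d3
        # so that no division is needed yet (flat ground assumed).
        v = [h_ep * d[0] + dx * d[2],
             h_ep * d[1] + dy * d[2],
             h_i * d[2]]
        # Block 57: rotate into a frame with two axes in the image plane.
        p = [sum(h2i[i][j] * v[j] for j in range(3)) for i in range(3)]
        # Block 59: similar triangles; the d3 factors cancel in the ratios.
        return f * p[1] / p[0], f * p[2] / p[0]   # y_f, z_f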
Knowing where the ground intersection point is located on the image source 11, it is only necessary to position the spot of scanning device 13 so that it intersects that point on the image. For example, if device 13 of FIG. 1 is a flying spot scanner and the image is on film, a system such as FIG. 4 would be used. Flying spot scanner 61 will have an electron gun 63 and horizontal and vertical deflection plates 65 (only the vertical plates are shown). Electrons emitted by gun 63 will be deflected by plates 65 and impinge on the face of the flying spot scanner which is coated with phosphor. The light emitted by the phosphor surface will pass through film 67 and be collected by lens 69 to be imaged on photomultiplier tube 71 which provides a video signal to display 15 of FIG. 1. The relationship between the voltage on plates 65 and the resulting spot position is well known. Thus, it is only necessary to scale the values of y_f and z_f obtained in FIG. 2 so that the proper voltages are input to the plates.
Thus, as a spot moves across display 15 of FIG. 1 its associated instantaneous ground intersection point will be computed and used to find where that spot is on the film. This information will then be used to drive the flying spot scanner spot to that point, resulting in the proper ground point being displayed on display 15 for all points in time. The same system would be used if 67 were an orthophoto rather than a normal frame of movie film. The raster shape would differ but the computation and driving of the spot would be the same.
In a camera model system, y_f and z_f are the positions on the camera tube of the instantaneous ground intersection point. Thus, it is only necessary to drive the scan on the camera tube to that point which corresponds to the instantaneous line of sight associated with the display CRT's electron beam.
If the system is one where a TV camera is viewing an image as shown in FIG. 5, one additional step is necessary. The image 73, which could be a projected film image or a computer generated image, or other image on a screen (or CRT), is imaged on camera tube 75 through lens 77. Since the position on image 73 is known, but not the position on tube 75, it is necessary to multiply y_f and z_f by the ratio of image to object distance in the system to obtain the values used in scanning tube 75.
FIG. 6 shows a typical embodiment of raster computer 17 of FIG. 1. Sweep generator 81 will have an input on line 83 from the sync generator 19 of FIG. 1 to synchronize it with the display 15. If the display is planar, as assumed for block 33 of FIG. 2, the sweeps generated represent tan θ_w and tan ψ_w. This may be done by generating a normal TV type linear sweep since, with the distance to the center of the display fixed, the tangents of θ_w and ψ_w will correspond directly to the X and Y positions of the spot on the display. If a spherical display is involved, sines and cosines of θ_w and ψ_w and the direction cosines cos ψ_w cos θ_w, sin ψ_w cos θ_w, and sin θ_w must be generated. Apparatus to generate such scans is disclosed in U.S. application Ser. No. 108,446 filed by T. Cwynar et al. on Jan. 21, 1971.
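The planar-display observation, that a linear position sweep directly yields the tangents, can be illustrated with a toy sweep generator; the parameterization (t running from 0 to 1 across a line, a half-width angle, and a per-line elevation) is an assumption for illustration:

    import math

    def flat_display_sweep(t, line_elevation, half_width):
        # On a window a unit distance from the eye, the horizontal spot
        # position already equals tan(psi_w), so a linear ramp in position
        # is the required tangent; the line's elevation gives tan(theta_w).
        tan_psi_w = (2.0 * t - 1.0) * math.tan(half_width)
        tan_theta_w = math.tan(line_elevation)  # fixed for the whole line
        return tan_psi_w, tan_theta_w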
The outputs of sweep generator 81 are inputs to block 85, a transformation apparatus. This apparatus comprises three servos, each driving sine-cosine potentiometers. The three servos correspond to the window heading, pitch and roll angles and are driven by inputs corresponding to these values from computer 23. The computation done in this block is equivalent to that of blocks 27 and 33 of FIG. 2 combined. The servo driven potentiometers are connected together to perform the required multiplications. A system which describes how such multiplications are performed is shown in U.S. Pat. No. 3,003,252 granted to E. G. Schwarm on Oct. 10, 1961.
The outputs from block 85 are inputs to a similar transformation block 87 which has servo inputs ψ_s, θ_s and φ_s. This block will do the computations of the combined blocks 25 and 35 of FIG. 2. Two of the outputs of block 87, d_1 and d_2, are multiplied by h_EP, obtained from computer 23, in multipliers 89 and 91 respectively. Values of Δx, Δy and h_I, also obtained from computer 23, are respectively multiplied by the third output of block 87 (d_3) in multipliers 93, 95 and 97. (All multipliers may be Analog Devices Model 422J or their equivalent.)
In summing amplifier 99 the h_EP·d_1 output from multiplier 89 is added to the Δx·d_3 output of multiplier 93, and in summing amplifier 101 the h_EP·d_2 output of multiplier 91 is added to the Δy·d_3 output of multiplier 95. The h_I·d_3 output of multiplier 97 and the outputs of amplifiers 99 and 101 (h_EP·d_1 + Δx·d_3 and h_EP·d_2 + Δy·d_3) form the x_GI·d_3, y_GI·d_3 and z_GI·d_3 of block 51 of FIG. 2.
These three signals are inputs to block 103, another transformation block similar to blocks 85 and 87, wherein the computations of block 57 of FIG. 2 are performed, resulting in x_p·d_3, y_p·d_3 and z_p·d_3. The servo inputs from computer 23 in this case are ψ_I, θ_I and φ_I. The y_p·d_3 and x_p·d_3 are provided as inputs to divider 105, and z_p·d_3 and x_p·d_3 to divider 107. By scaling using normal analog techniques, the constant f of block 59 of FIG. 2 may be included in this computation, thus causing dividers 105 and 107 to have respective outputs representing the y_f and z_f of block 59 of FIG. 2. These outputs are then used as inputs to scanned raster device 13 of FIG. 1. The dividers used may be constructed using the instructions given on the data sheet for Analog Devices Multiplier Model 422J published by Analog Devices of Norwood, Mass.
As shown in FIG. 6 the matrix multiplications are done using servo multipliers. It should be noted that the ω matrix of block 27 of FIG. 2 and the α matrix of block 25 may be multiplied in the simulator computer, in which case only one set of angles and thus only one block 85 or 87 would be required in the embodiment of FIG. 6. It is also possible to compute the required sines and cosines in computer 23 and perform the matrix multiplications using additional multipliers similar to blocks 89, 91, etc.
Various modifications may be made without departing from the principles of the invention. One such modification is shown in FIG. 7. Because of screen shape it is often desirable to roll the image optically. However, since the equations implicitly take roll into account, if optical roll is used, derotation in the raster computer is required.
Basically, the circuits of FIG. 7 perform the function of a resolver, transforming the coordinates y_f and z_f in one axis system to the coordinates y_f′ and z_f′ in an axis system rotated an angle φ from the original system. Values of sin φ and cos φ are obtained from computer 23 and the values y_f sin φ, y_f cos φ, z_f sin φ and z_f cos φ obtained in multipliers 111, 113, 115, and 117. In summing amplifier 119, y_f′ is found by adding z_f sin φ and y_f cos φ, and in amplifier 121, z_f′ is found by adding z_f cos φ and −y_f sin φ. (Signs are inverted through amplifiers 119 and 121.) In this manner optical roll, for example, is compensated for in the camera raster computer output.
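The FIG. 7 derotation is an ordinary two-dimensional rotation; a sketch follows, with the sign convention of the roll angle taken as an assumption:

    import math

    def derotate(y_f, z_f, roll):
        # Resolver function of FIG. 7: rotate the raster-computer outputs by
        # the optical roll angle so optically rolled imagery is compensated.
        y_r = y_f * math.cos(roll) + z_f * math.sin(roll)
        z_r = z_f * math.cos(roll) - y_f * math.sin(roll)
        return y_r, z_r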
An examination of FIG. 6 shows that a relatively large number of multiplications and transformations must be done in the raster computer. Each function performed contributes to the noise in the system with the analog multipliers causing the greatest problems because of internal noise generation. Thus, it is desirable to have as few functions performed in the raster computer as possible.
The only variables changing at a rate which requires the use of analog computations are ψ_w and θ_w. It was previously noted that blocks 33 and 35 may be combined by doing further computation in the digital computer. It is possible to go even further and combine not only blocks 33 and 35 but also 51 and 57 to end up with one matrix multiplication. Such an arrangement is shown in FIG. 8.
Only three blocks of computation are shown being done at fast computation rates in the analog raster computer. Sweep generator 81 provides the ψ_w and θ_w to block 123 where g_1, g_2 and g_3 are computed. The equations of block 33 of FIG. 2 are for a flat display and tangent functions are used. In block 123 the equations for a spherical display are used. If block 123 were computing for a flat display the equations would be g_1 = 1, g_2 = tan ψ_w and g_3 = tan θ_w. These quantities go to block 125 where A, B and C are computed from the g's and π_ij's. These two computations replace all those shown in blocks 33, 35, 51, and 57 of FIG. 2. The π_ij's are found in the digital computer 23 using the quantities in the above mentioned blocks of FIG. 2. The final block 127 corresponds to block 59 of FIG. 2. The precise way of combining all the various transformations is not shown as it will be well within the capability of those skilled in the art to derive the equations for the π_ij's.
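Structurally, the FIG. 8 form is a projective transformation: one precomputed 3x3 matrix applied to the fast spot vector, followed by a perspective divide. A minimal sketch, assuming pi and g as plain lists:

    def raster_coords(pi, g):
        # pi: 3x3 matrix of pi_ij terms, precomputed digitally from the
        # slowly varying quantities; g: the fast spot vector (g1, g2, g3).
        a, b, c = (sum(pi[i][j] * g[j] for j in range(3)) for i in range(3))
        return b / a, c / a    # y_c, z_c after the perspective divide

In modern terms this is the homogeneous-coordinate perspective transform later universal in computer graphics: all the slowly varying geometry collapses into the matrix, and only the multiply and divide must run at video rates.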
The implementation of these equations is shown in FIG. 9. Sweep generator 81 is the type previously described in connection with FIG. 6. In block 129 the g's are obtained using the types of multipliers previously mentioned in describing FIG. 6 to obtain g_1 and g_2, and an operational amplifier to invert sin θ_w for g_3. Blocks 131 are multiplying digital-to-analog converters such as Model 2254 available from Data Device Corporation of Hicksville, N.Y. In the implementation of FIG. 6 the quantities developed by the computer 23 were required to be converted to analog quantities before being used. This resulted in any noise on the analog lines being further amplified by the analog multipliers. By using the digital signals directly as multiplying D/A inputs, significant noise reduction is possible. The multiplied π_ij·g quantities are summed in amplifiers 133 to obtain A, B and C. The final outputs y_c and z_c are obtained by dividing B and C by A in block 135 (basically the same computation as was done in blocks 105 and 107 of FIG. 6).
It may be that the noise reduction of the systems of FIGS. 8 and 9 is not sufficient for some purposes. The equations for a system which uses the integration of rates are shown in FIG. 10. Since positions will be obtained using analog integrators, a filtering effect will result which should further reduce noise. The equations shown are essentially the rate equivalents of the position equations of FIG. 8.
Block 137, where the rates of the g's are computed; 139, where the rates of A, B and C are computed; and 141, where the rates of y_c and z_c are computed, are the equivalents of blocks 123, 125 and 127 of FIG. 8. In addition, a block 142, wherein the g's themselves are computed for use in blocks 137 and 139, is required. And as indicated, digital computer 23 computes both the π_ij's and their rates. The final step, of integration, which provides the filtering to reduce noise, is shown in block 143.
As with any integration, initial values are required. The method of obtaining these values is shown in blocks 145, 147 and 149 in the lower part of FIG. 10. The system is initialized for each horizontal line. Thus in block 145 the initial g's are computed for a line beginning at a value of ψ_w of 30° (in a particular embodiment; in other embodiments another proper constant defining the azimuth of the starting position would be used). Thus initial g's for each line, based on the constant 30° and the θ_w associated with a given line, are computed. In 147, the initial A, B and C for these starting points are computed, and in block 149 the initial y_c and z_c are computed. These three blocks are the same as blocks 123, 125 and 127 of FIG. 8 except that instead of computing continuous values, they only compute the initial starting point of horizontal lines.
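A sketch of the FIG. 10 idea, with hypothetical helpers g_of and g_dot_of returning the spot vector and its derivative with respect to the sweep angle: the rates of A, B and C are integrated across a line, and the integrators are restarted from exact values at each line start so drift cannot accumulate from line to line:

    import math

    def scan_line(pi, pi_dot, g_of, g_dot_of, theta_w, psi_rate, dt, n):
        psi = math.radians(30.0)       # line-start azimuth (one embodiment)
        g = g_of(psi, theta_w)
        abc = [sum(pi[i][j] * g[j] for j in range(3)) for i in range(3)]
        points = []
        for _ in range(n):
            g = g_of(psi, theta_w)
            g_dot = [psi_rate * dg for dg in g_dot_of(psi, theta_w)]
            for i in range(3):         # d/dt A = pi_dot.g + pi.g_dot, etc.
                abc[i] += dt * sum(pi_dot[i][j] * g[j] + pi[i][j] * g_dot[j]
                                   for j in range(3))
            points.append((abc[1] / abc[0], abc[2] / abc[0]))
            psi += psi_rate * dt
        return points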
Initialization might also be done only each field or frame if the integrators used are accurate enough. A line-by-line initialization, however, assures that each line will start at the same azimuth independent of integrator accuracy. It should also be noted that the initial values need not be computed in real time and may thus be precomputed and stored. A particular implementation of these equations is not shown, as the techniques of FIGS. 6 and 9, along with other well-known analog methods, may be used in implementation, as will be recognized by those skilled in the art.
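The rate-integration scheme can be sketched as follows. This is a minimal illustration, not the patented circuitry: an Euler step stands in for the analog integrators of block 143, the spherical-display g's of the earlier figures are assumed, and pi, pi_dot, psi_rate and the remaining names are hypothetical stand-ins for the quantities developed in digital computer 23 and sweep generator 81.

import numpy as np

def g_vec(psi, theta):
    # assumed spherical-display g's (direction cosines of the spot line of sight)
    return np.array([np.cos(theta) * np.cos(psi),
                     np.cos(theta) * np.sin(psi),
                     -np.sin(theta)])

def g_dot(psi, theta, psi_rate):
    # time derivative of the g's along one horizontal line (theta held constant)
    return psi_rate * np.array([-np.cos(theta) * np.sin(psi),
                                np.cos(theta) * np.cos(psi),
                                0.0])

def trace_line(pi, pi_dot, psi0, theta, psi_rate, dt, n_steps):
    # blocks 145, 147, 149: initial values at the line's starting azimuth
    A0, B0, C0 = pi @ g_vec(psi0, theta)
    y, z = B0 / A0, C0 / A0
    points = [(y, z)]
    psi = psi0
    for _ in range(n_steps):
        A, B, C = pi @ g_vec(psi, theta)
        # blocks 137 and 139: rate terms from the pi-dots and g-dots
        Ad, Bd, Cd = pi_dot @ g_vec(psi, theta) + pi @ g_dot(psi, theta, psi_rate)
        # blocks 141 and 143: quotient-rule rates, then integration
        y += dt * (Bd * A - B * Ad) / A**2
        z += dt * (Cd * A - C * Ad) / A**2
        psi += psi_rate * dt
        points.append((y, z))
    return points

Because each line is re-initialized from precomputable starting values, integrator drift cannot accumulate from one line to the next, which is the point made above about line-by-line initialization.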
These last two sets of equations, although offering many advantages, have certain disadvantages in cost due to the large number of technically sophisticated components. Another set of equations, which provides a raster computer that is simpler and more noise-free than that of FIG. 6, is shown in FIG. 11. This set of equations allows the type of servo multipliers described in connection with FIG. 6 to be used in the matrix multiplications. It will be recognized that the π's used in the equations of FIGS. 8 and 10 do not lend themselves to use with servos, and thus the multiplying converters were required.
In block 123 the g's are computed as before (in FIGS. 8 and 10). In block 151 the d's are computed in a manner similar to that done in block 33 of FIG. 2 (block 87 of FIG. 6). Here, in effect, the w's of block 33 and the a's of block 35 of FIG. 2 have been combined into a matrix composed of ψ, θ and φ terms. Block 153 is essentially the same as block 51 of FIG. 2. Additional digital computer computations have been used to provide X_c, Y_c and H_c to eliminate some of the analog multiplications associated with block 51 of FIG. 2. Block 155 is the same as block 57 of FIG. 2 except that, instead of finding film image plane coordinates, the scanned raster coordinates are found directly. (This is also true in the equations of FIGS. 8 and 10.) The final step in block 157 corresponds directly to that of block 59 of FIG. 2, again with the exception that y_c and z_c rather than y_f and z_f are obtained. (The f subscript denotes film image plane coordinates and the c subscript scanned raster device coordinates.)
Implementation is essentially the same as that shown in FIG. 6. One of blocks 85 or 87 will be eliminated, since the aircraft attitude angles and the window orientation angles have been combined into a single set of combined angles. Multipliers 89 and 91 are eliminated, since d_1 and d_2 are added directly to the products of multipliers 93 and 95 (93, 95 and 97 will now have as inputs X_c, Y_c and H_c respectively), and the final circuit outputs will be y_c and z_c rather than y_f and z_f, since the inputs to block 103 will be the combined angles rather than ψ, θ and φ alone. The equations above assume that the relationship between the center of the window and the angles ψ_w and θ_w remains fixed. Such would be the case for a single fixed display window, and in some cases where the center of the display (meaning here the imagery displayed) is allowed to move.
However, in certain types of systems the equations described above will have to be varied to achieve the result of always defining the line of sight from the observer's eye through the instantaneous spot position. For example, in the type of system described in application Ser. No. 66,729, filed by R. F. H. McCoy on Aug. 25, 1970, wherein a total wide-angle spherical display is made up of tiers of narrow-angle displays, the display raster will generally be made to trace circles of latitude. The center of a high-resolution image to be displayed is capable of being positioned anywhere on the display, and ψ_w and θ_w, which are associated with the high-resolution image, define at each point in time latitude and longitude increments referenced to the fixed display frame. The ψ_w and θ_w will then define the spot position with respect to the center of the moving window. To reference ψ_w and θ_w to this fixed frame it is then only necessary to add the latitude and longitude (of the center of the moving window) respectively to ψ_w and θ_w, and then take the sines and cosines of the resulting angular sums in order to find the direction cosines of the instantaneous line of sight.
At this point a more detailed explanation seems in order, particularly in view of the changes required in the equations of FIG. 8 and those following. In FIG. 8 et seq., where the g terms are computed, the ψ_w and θ_w would have to be changed to (ψ_w + ψ) and (θ_w + θ), where ψ and θ represent the respective longitude and latitude of the center of the moving window. In practice it has been found difficult to combine these angles and then take their sines and cosines. This difficulty may be overcome by using the well-known trigonometric relationships for finding the sine and cosine of the sum of two angles. Doing this, however, requires that three additional g terms be computed.
The additional terms to be computed are:

g_4 = cos ψ_w sin θ_w
g_5 = sin ψ_w sin θ_w
g_6 = cos θ_w

These are then multiplied by the π's (which must be appropriately altered in such a way that properly takes the sines and cosines of ψ and θ into account) in block 125 of FIG. 8 to result in the following equations:

A_c = π_a1 g_1 + π_a2 g_2 + π_a3 g_3 + π_a4 g_4 + π_a5 g_5 + π_a6 g_6
B_c = π_b1 g_1 + π_b2 g_2 + π_b3 g_3 + π_b4 g_4 + π_b5 g_5 + π_b6 g_6
C_c = π_c1 g_1 + π_c2 g_2 + π_c3 g_3 + π_c4 g_4 + π_c5 g_5 + π_c6 g_6

These additional terms will of course require additional hardware computing elements, which may be constructed in the same manner as shown in FIG. 9.
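To make the role of the six g terms concrete, the following sketch checks, under assumed conventions, that fixed linear combinations of g_1 through g_6, with coefficients depending only on the window-center angles, reproduce the direction cosines of the summed angles; those coefficients are what get folded into the altered π's. The names and the specific direction-cosine convention are illustrative assumptions, not the patent's circuitry.

import numpy as np

def g6(psi_w, theta_w):
    # g1..g3 as before, plus the three additional terms g4..g6
    c, s = np.cos, np.sin
    return np.array([c(theta_w) * c(psi_w), c(theta_w) * s(psi_w), -s(theta_w),
                     c(psi_w) * s(theta_w), s(psi_w) * s(theta_w), c(theta_w)])

def offset_coeffs(psi0, theta0):
    # angle-sum coefficients, functions of the window-center longitude and
    # latitude only; these are what get absorbed into the altered pi's
    c, s = np.cos, np.sin
    return np.array([
        [c(theta0)*c(psi0), -c(theta0)*s(psi0), 0.0, -s(theta0)*c(psi0),  s(theta0)*s(psi0), 0.0],
        [c(theta0)*s(psi0),  c(theta0)*c(psi0), 0.0, -s(theta0)*s(psi0), -s(theta0)*c(psi0), 0.0],
        [0.0, 0.0, c(theta0), 0.0, 0.0, -s(theta0)],
    ])

# sanity check: the six-term sums give the direction cosines of the summed
# angles without ever forming psi_w + psi0 or theta_w + theta0 directly
psi_w, theta_w, psi0, theta0 = 0.1, 0.2, 0.3, 0.4
direct = g6(psi_w + psi0, theta_w + theta0)[:3]
assert np.allclose(offset_coeffs(psi0, theta0) @ g6(psi_w, theta_w), direct)

In an implementation like FIG. 9, multiplying the six g's by the altered π's in the multiplying D/A converters performs exactly this combination, which is why three additional hardware terms suffice.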
Thus a general method, and a number of specific implementations of that method, for changing the apparent perspective of an image have been disclosed, the method being of general application in visual systems utilizing scanned raster devices. A general set of equations and a straightforward implementation were first shown, and then various improvements which result in increased efficiency and noise reduction were disclosed.
Although specific systems which are useful in flight simulators have been disclosed herein, the invention may be used in similar applications such as space simulators, ship simulators, driver trainers, etc.
What is claimed is:
1. In a display system for presenting to an observer a desired simulated scene of the earth's surface as viewed from the observer's viewpoint, comprising an image source depicting a portion of the earth's surface as viewed from an image viewing point, at least part of which scene contains the same information as that contained in the desired scene; a device with a controllable first spot for scanning the image source to develop a video signal; a display located within the observer's field of view to form a simulated window through which the observer may view said simulated scene, said display being of the type formed by scanning a second spot across the display to form a raster and modulating said second spot with the video signal developed by said device, a method of driving said first spot to obtain an image of the desired scene in proper perspective on said display comprising:
a. determining the simulated point of intersection with the earth's surface of a line from the observer passing through the instantaneous position of the second scanning spot on the display window;
b. determining the location on the image source of the depiction thereon of said earth intersection point; and
c. positioning the first spot so that the video signal developed corresponds to said location on said image source.
2. The invention according to claim 1 wherein said image source is optically rotated to simulate rotation of said display window and further including the step of compensating for said rotation whereby the scanning first spot will generally follow a path more nearly approximating a normal raster.
3. The invention according to claim 1 wherein the steps of determining said earth intersection point and said location on said image source comprise:
a. determining a first set of direction cosines of the trainer body axes to a horizontal referenced axis system;
b. using said direction cosines to compute the components of said observer's eye position with respect to the simulated center of gravity in said horizontal reference axis system;
c. determining a second set of direction cosines of the display window axes to said body axes;
d. determining from the scan waveforms of said second spot the direction of a line from the observer to the instantaneous position of said second spot in the window axes frame;
e. using said second set of direction cosines to determine the direction of a line from the origin of said window axes passing through said second spot with respect to said body axes;
f. using said first set of direction cosines to reference the direction of said line to the horizontal reference axes;
g. determining the location of said observer's eyepoint with respect to said image viewing point;
h. determining from said line referenced to said horizontal axes and the altitude and location of said observer's eyepoint with respect to said image viewing point the intersection point with the earth's surface referenced to said image viewing point with respect to the horizontal axes system;
i. determining the direction of a line from said image viewing point to said intersection point with respect to an axes system referenced to the image source;
j. using the direction of said line referenced to said image source axes to determine the location on said image source of the depiction of said earth intersection point.
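Steps a through j above amount to a ray-to-ground intersection followed by a re-projection into image-source coordinates. A minimal sketch of that core computation follows, assuming a flat earth at zero altitude, a z-up horizontal frame, x-forward image-source axes and a pinhole-style projection; all names and conventions here are illustrative, not the claimed apparatus.

import numpy as np

def ground_intersection(eye, sight):
    # eye: observer position in the horizontal frame (z measured upward,
    # ground plane at z = 0); sight: unit line-of-sight direction through
    # the instantaneous second-spot position (steps d through f)
    t = -eye[2] / sight[2]            # sight[2] < 0 when looking downward
    return eye + t * sight            # step h: earth intersection point

def image_source_location(point, view_point, R_image, scale):
    # step i: direction from the image viewing point to the ground point,
    # expressed in image-source axes via the rotation R_image; step j:
    # projection to the location to which the first spot is driven
    d = R_image @ (point - view_point)
    return scale * d[1] / d[0], scale * d[2] / d[0]

eye = np.array([0.0, 0.0, 1500.0])    # hypothetical simulated eyepoint
sight = np.array([0.8, 0.1, -0.59])
sight /= np.linalg.norm(sight)
p = ground_intersection(eye, sight)
y_i, z_i = image_source_location(p, np.array([0.0, 0.0, 3000.0]), np.eye(3), 1.0)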
4. The invention according to claim 3 wherein said display is a wide angle spherical display having a fixed frame of reference, only a relatively small portion of which is modulated by said video signal, the center of said portion is movable and may be defined by a latitude and longitude, said second set of direction cosines are the fixed display frame to body axes direction cosines; and the direction of said line in said display frame is obtained by adding the scan wave forms of said second spot to said latitude and longitude.
5. The invention according to claim 3 wherein the steps of determining said first set of direction cosines, using said first set of direction cosines to compute said observer's eye position, determining said second set of direction cosines, and determining the location of said observer's eyepoint with respect to said image viewing point are performed in a digital computer and the remaining steps performed by analog computing means.
6. The invention according to claim 5 wherein the results of the digital computation are combined into a third set of direction cosines so that the steps of determining the direction of a line through said second spot, referencing said spot to the horizontal reference axes, and determining the intersection point on the earth's surface are combined in the digital computer and said third set of direction cosines is then used in an analog computer to determine from said third set and the direction lines of said second spot with respect to the window axes, the direction of a line from said image viewing point.
7. The invention according to claim 5 wherein computations are done using instantaneous position information.
8. The invention according to claim 5 wherein computations in the analog computer are done by the integration of rate information developed in the digital computer and further including a step to periodically initialize the analog computer.
9. The invention according to claim 8 wherein said initialization step is done for each horizontal scan of said second spot.
10. The invention according to claim 3 wherein said first set of direction cosines and said second set of direction cosines are computed and combined in a digital computer to form a fourth set of direction cosines and said fourth set is then used to reference said direction lines of said second spot to the horizontal axes.
11. The invention according to claim 10 wherein the image is rolled optically and further including the step of determining by computation in the digital computer an image source axes rolled by the amount of image roll and using said axes in determining the direction of said line from said image viewing point.
12. In a display system for presenting to an observer a desired simulated scene of the earth's surface as viewed from the observer's viewpoint, comprising an image source depicting a portion of the earth's surface as viewed from an image viewing point, at least part of which scene contains the same information as that contained in the desired scene; a device with a controllable first spot for scanning the image source to develop a video signal, a display located within the observer's field of view to form a simulated window through which the observer may view said simulated scene, said display being of the type formed by scanning a second spot across the display to form a raster and modulating said second spot with the video signal developed by said device, apparatus for driving said first spot to obtain an image of the desired scene in proper perspective on said display comprising:
a. means for determining the simulated point of intersection with the earth's surface of a line from the observer passing through the instantaneous position of the second scanning spot on the display window;
b. means for determining the location on the image source of the depicting thereon of said earth intersection point; and
c. means for positioning the first spot so that the video signal developed corresponds to said location on said image source.
13. The invention according to claim 12 wherein said display is used in combination with a fixed-base vehicle trainer and said observer is the trainee.
14. The invention according to claim 13 wherein said trainer is an aircraft simulator.
15. The invention according to claim 12 wherein said image source is a frame of a motion picture and said image source viewing point is the point from which said frame was taken.
16. The invention according to claim 15 wherein said device is a TV camera on which said frame is imaged.
17. The invention according to claim 15 wherein said device is a flying spot scanner, pickup photomultiplier tube and associated optics and wherein said device is arranged to scan said frame.
18. The invention according to claim 12 wherein said image source is the image obtained from an optical probe viewing a model, said model is a portion of the earth's surface and said device is a TV camera on which said image is focused.
19. The invention according to claim 12 wherein said image source is an orthophoto and said device is a flying spot scanner with associated pickup and optics arranged to scan said orthophoto.
UNITED STATES PATENT OFFICE
CERTIFICATE OF CORRECTION

Patent No. 3,725,563    Dated April 3, 1973
Inventor(s): Brian J. Woycechowsky

It is certified that error appears in the above-identified patent and that said Letters Patent are hereby corrected as shown below:

Column 5, line 15, change "If" to --With--; line 16, delete "is assumed to be"; line 32, change "This" to --These--. Column 6, line 13, change "sign" to --scan--; line 33, change "tangenets" to --tangents--; line 38, after "U.S." insert --Patent No. 3,688,098 issued on an--; line 39, change "108446" to --108,446--, change "filed by" to --of--, and change "on" to --filed--; line 40, after "1971" delete the period and insert --and assigned to the same assignee as the present invention--. Column 11, claim 1, line 9, after "signal" change the comma to a semicolon; line 23, change "depicting" to --depiction--.

Signed and sealed this 8th day of January 1974.

(SEAL) Attest:

EDWARD M. FLETCHER, JR.    RENE D. TEGTMEYER
Attesting Officer          Acting Commissioner of Patents

Claims (19)

1. In a display system for presenting to an observer a desired simulated scene of the earth's surface as viewed from the observer's viewpoint, comprising an image source depicting a portion of the earth's surface as viewed from an image viewing point, at least part of which scene contains the same information as that contained in the desired scene; a device with a controllable first spot for scanning the image source to develop a video signal; a display located within the observer's field of view to form a simulated window through which the observer may view said simulated scene, said display being of the type formed by scanning a second spot across the display to form a raster and modulating said second spot with the video signal developed by said device, a method of driving said first spot to obtain an image of the desired scene in proper perspective on said display comprising: a. determining the simulated point of intersection with the earth's surface of a line from the observer passing through the instantaneous position of the second scanning spot on the display window; b. determining the location on the image source of the depiction thereon of said earth intersection point; and c. positioning the first spot so that the video signal developed corresponds to said location on said image source.
2. The invention according to claim 1 wherein said image source is optically rotated to simulate rotation of said display window and further including the step of compensating for said rotation whereby the scanning first spot will generally follow a path more nearly approximating a normal raster.
3. The invention according to claim 1 wherein the steps of determining said earth intersection point and said location on said image source comprise: a. determining a first set of direction cosines of the trainer body axes to a horizontal referenced axis system; b. using said direction cosines to compute the components of said observer's eye position with respect to the simulated center of gravity in said horizontal reference axis system; c. determining a second set of direction cosines of the display window axes to said body axes; d. determining from the scan waveforms of said second spot the direction of a line from the observer to the instantaneous position of said second spot in the window axes frame; e. using said second set of direction cosines to determine the direction of a line from the origin of said window axes passing through said second spot with respect to said body axes; f. using said first set of direction cosines to reference the direction of said line to the horizontal reference axes; g. determining the location of said observer's eyepoint with respect to said image viewing point; h. determining from said line referenced to said horizontal axes and the altitude and location of said observer's eyepoint with respect to said image viewing point the intersection point with the earth's surface referenced to said image viewing point with respect to the horizontal axes system; i. determining the direction of a line from said image viewing point to said intersection point with respect to an axes system referenced to the image source; j. using the direction of said line referenced to said image source axes to determine the location on said image source of the depiction of said earth intersection point.
4. The invention according to claim 3 wherein said display is a wide angle spherical display having a fixed frame of reference, only a relatively small portion of which is modulated by said video signal, the center of said portion is movable and may be defined by a latitude and longitude, said second set of direction cosines are the fixed display frame to body axes direction cosines; and the direction of said line in said display frame is obtained by adding the scan wave forms of said second spot to said latitude and longitude.
5. The invention according to claim 3 wherein the steps of determining said first set of direction cosines, using said first set of direction cosines to compute said observer's eye position, determining said second set of direction cosines, and determining the location of said observer's eyepoint with respect to said image viewing point are performed in a digital computer and the remaining steps performed by analog computing means.
6. The invention according to claim 5 wherein the results of the digital computation are combined into a third set of direction cosines so that the steps of determining the direction of a line through said second spot, referencing said spot to the horizontal reference axes, and determining the intersection point on the earth's surface are combined in the digital computer and said third set of direction cosines is then used in an analog computer to determine from said third set and the direction lines of said second spot with respect to the window axes, the direction of a line from said image viewing point.
7. The invention according to claim 5 wherein computations are done using instantaneous position information.
8. The invention according to claim 5 wherein computations in the analog computer are done by the integration of rate information developed in the digital computer and further including a step to periodically initialize the analog computer.
9. The invention according to claim 8 wherein said initialization step is done for each horizontal scan of said second spot.
10. The invention according to claim 3 wherein said first set of direction cosines and said second set of direction cosines are computed and combined in a digital computer to form a fourth set of direction cosines and said fourth set is then used to reference said direction lines of said second spot to the horizontal axes.
11. The invention according to claim 10 wherein the image is rolled optically and further including the step of determining by computation in the digital computer an image source axes rolled by the amount of image roll and using said axes in determining the direction of said line from said image viewing point.
12. In a display system for presenting to an observer a desired simulated scene of the earth's surface as viewed from the observer's viewpoint, comprising an image source depicting a portion of the earth's surface as viewed from an image viewing point, at least part of which scene contains the same information as that contained in the desired scene; a device with a controllable first spot for scanning the image source to develop a video signal, a display located within the observer's field of view to form a simulated window through which the observer may view said simulated scene, said display being of the type formed by scanning a second spot across the display to form a raster and modulating said second spot with the video signal developed by said device, apparatus for driving said first spot to obtain an image of the desired scene in proper perspective on said display comprising: a. means for determining the simulated point of intersection with the earth's surface of a line from the observer passing through the instantaneous position of the second scanning spot on the display window; b. means for determining the location on the image source of the depicting thereon of said earth intersection point; and c. means for positioning the first spot so that the video signal developed corresponds to said location on said image source.
13. The invention according to claim 12 wherein said display is used in combination with a fixed-base vehicle trainer and said observer is the trainee.
14. The invention according to claim 13 wherein said trainer is an aircraft simulator.
15. The invention according to claim 12 wherein said image source is a frame of a motion picture and said image source viewing point is the point from which said frame was taken.
16. The invention according to claim 15 wherein said device is a TV camera on which said frame is imaged.
17. The invention according to claim 15 wherein said device is a flying spot scanner, pickup photomultiplier tube and associated optics and wherein said device is arranged to scan said frame.
18. The invention according to claim 12 wherein said image source is the image obtained from an optical probe viewing a model, said model is a portion of the earth's surface and said device is a TV camera on which said image is focused.
19. The invention according to claim 12 wherein said image source is an orthophoto and said device is a flying spot scanner with associated pickup and optics arranged to scan said orthophoto.
US00211372A 1971-12-23 1971-12-23 Method of perspective transformation in scanned raster visual display Expired - Lifetime US3725563A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US21137271A 1971-12-23 1971-12-23

Publications (1)

Publication Number Publication Date
US3725563A true US3725563A (en) 1973-04-03

Family

ID=22786663

Family Applications (1)

Application Number Title Priority Date Filing Date
US00211372A Expired - Lifetime US3725563A (en) 1971-12-23 1971-12-23 Method of perspective transformation in scanned raster visual display

Country Status (2)

Country Link
US (1) US3725563A (en)
JP (1) JPS564918B2 (en)

Cited By (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3892051A (en) * 1973-10-31 1975-07-01 Gen Electric Simulated collimation of computer generated images
US3943344A (en) * 1973-06-30 1976-03-09 Tokyo Shibaura Electric Co., Ltd. Apparatus for measuring the elevation of a three-dimensional foreground subject
US4208719A (en) * 1978-08-10 1980-06-17 The Singer Company Edge smoothing for real-time simulation of a polygon face object system as viewed by a moving observer
US4241519A (en) * 1979-01-25 1980-12-30 The Ohio State University Research Foundation Flight simulator with spaced visuals
FR2466061A1 (en) * 1979-06-15 1981-03-27 Redifon Simulation Ltd IMPROVEMENT TO VISUALIZATION SYSTEMS OF THE IMAGE TYPE GENERATED BY COMPUTER
US4263726A (en) * 1978-04-22 1981-04-28 Redifon Simulation Limited Visual display apparatus
US4276029A (en) * 1979-05-17 1981-06-30 The Ohio State University Visual cue simulator
US4283765A (en) * 1978-04-14 1981-08-11 Tektronix, Inc. Graphics matrix multiplier
US4296930A (en) * 1975-11-26 1981-10-27 Bally Manufacturing Corporation TV Game apparatus
US4475172A (en) * 1978-05-30 1984-10-02 Bally Manufacturing Corporation Audio/visual home computer and game apparatus
US4500879A (en) * 1982-01-06 1985-02-19 Smith Engineering Circuitry for controlling a CRT beam
FR2550873A1 (en) * 1983-08-19 1985-02-22 Thomson Csf SYSTEM FOR VIEWING IMAGES RECORDED ON FILM AND USABLE FOR FLIGHT SIMULATION
US4521196A (en) * 1981-06-12 1985-06-04 Giravions Dorand Method and apparatus for formation of a fictitious target in a training unit for aiming at targets
US4827438A (en) * 1987-03-30 1989-05-02 Halliburton Company Method and apparatus related to simulating train responses to actual train operating data
US4853883A (en) * 1987-11-09 1989-08-01 Nickles Stephen K Apparatus and method for use in simulating operation and control of a railway train
US4982345A (en) * 1989-01-23 1991-01-01 International Business Machines Corporation Interactive computer graphics display system processing method for identifying an operator selected displayed object
US5247356A (en) * 1992-02-14 1993-09-21 Ciampa John A Method and apparatus for mapping and measuring land
US5253051A (en) * 1991-03-05 1993-10-12 Mcmanigal Paul G Video artificial window apparatus
US5684937A (en) * 1992-12-14 1997-11-04 Oxaal; Ford Method and apparatus for performing perspective transformation on visible stimuli
US20040004621A1 (en) * 1995-11-15 2004-01-08 Ford Oxaal Method for interactively viewing full-surround image data and apparatus therefor
US20050007479A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Multiple object processing in wide-angle video camera
US20050007477A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Correction of optical distortion by image processing
US20050007478A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Multiple-view processing in wide-angle video camera
US20050007453A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Method and system of simultaneously displaying multiple views for video surveillance
US20050028215A1 (en) * 2003-06-03 2005-02-03 Yavuz Ahiska Network camera supporting multiple IP addresses
US20060056056A1 (en) * 2004-07-19 2006-03-16 Grandeye Ltd. Automatically expanding the zoom capability of a wide-angle video camera
US20060062478A1 (en) * 2004-08-16 2006-03-23 Grandeye, Ltd., Region-sensitive compression of digital video
US20070124783A1 (en) * 2005-11-23 2007-05-31 Grandeye Ltd, Uk, Interactive wide-angle video server
US7366359B1 (en) 2004-07-08 2008-04-29 Grandeye, Ltd. Image processing of regions in a wide angle video camera
US20080123994A1 (en) * 2006-08-30 2008-05-29 Stephen Schultz Mosaic Oblique Images and Methods of Making and Using Same
US20080204570A1 (en) * 2007-02-15 2008-08-28 Stephen Schultz Event Multiplexer For Managing The Capture of Images
US20080231700A1 (en) * 2007-02-01 2008-09-25 Stephen Schultz Computer System for Continuous Oblique Panning
US20080273753A1 (en) * 2007-05-01 2008-11-06 Frank Giuffrida System for Detecting Image Abnormalities
US20090096884A1 (en) * 2002-11-08 2009-04-16 Schultz Stephen L Method and Apparatus for Capturing, Geolocating and Measuring Oblique Images
US20090097744A1 (en) * 2007-10-12 2009-04-16 Stephen Schultz System and Process for Color-Balancing a Series of Oblique Images
US20090141020A1 (en) * 2007-12-03 2009-06-04 Freund Joseph G Systems and methods for rapid three-dimensional modeling with real facade texture
US20100002071A1 (en) * 2004-04-30 2010-01-07 Grandeye Ltd. Multiple View and Multiple Object Processing in Wide-Angle Video Camera
US20100296693A1 (en) * 2009-05-22 2010-11-25 Thornberry Dale R System and process for roof measurement using aerial imagery
US7893985B1 (en) 2004-03-15 2011-02-22 Grandeye Ltd. Wide angle electronic camera with improved peripheral vision
US7894531B1 (en) 2005-02-15 2011-02-22 Grandeye Ltd. Method of compression for wide angle digital video
US20110096083A1 (en) * 2009-10-26 2011-04-28 Stephen Schultz Method for the automatic material classification and texture simulation for 3d models
US8477190B2 (en) 2010-07-07 2013-07-02 Pictometry International Corp. Real-time moving platform management system
US8588547B2 (en) 2008-08-05 2013-11-19 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area
US8823732B2 (en) 2010-12-17 2014-09-02 Pictometry International Corp. Systems and methods for processing images with edge detection and snap-to feature
US8860780B1 (en) 2004-09-27 2014-10-14 Grandeye, Ltd. Automatic pivoting in a wide-angle video camera
US9141615B1 (en) 2004-11-12 2015-09-22 Grandeye, Ltd. Interactive media server
US9183538B2 (en) 2012-03-19 2015-11-10 Pictometry International Corp. Method and system for quick square roof reporting
US9262818B2 (en) 2007-05-01 2016-02-16 Pictometry International Corp. System for detecting image abnormalities
US9275080B2 (en) 2013-03-15 2016-03-01 Pictometry International Corp. System and method for early access to captured images
US9292913B2 (en) 2014-01-31 2016-03-22 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US9602700B2 (en) 2003-05-02 2017-03-21 Grandeye Ltd. Method and system of simultaneously displaying multiple views for video surveillance
US9612598B2 (en) 2014-01-10 2017-04-04 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US9753950B2 (en) 2013-03-15 2017-09-05 Pictometry International Corp. Virtual property reporting for automatic structure detection
US9881163B2 (en) 2013-03-12 2018-01-30 Pictometry International Corp. System and method for performing sensitive geo-spatial processing in non-sensitive operator environments
US9953112B2 (en) 2014-02-08 2018-04-24 Pictometry International Corp. Method and system for displaying room interiors on a floor plan
US10325350B2 (en) 2011-06-10 2019-06-18 Pictometry International Corp. System and method for forming a video stream containing GIS data in real-time
US10402676B2 (en) 2016-02-15 2019-09-03 Pictometry International Corp. Automated system and methodology for feature extraction
US10502813B2 (en) 2013-03-12 2019-12-10 Pictometry International Corp. LiDAR system producing multiple scan paths and method of making and using same
US10671648B2 (en) 2016-02-22 2020-06-02 Eagle View Technologies, Inc. Integrated centralized property database systems and methods

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0368117U (en) * 1989-11-08 1991-07-04

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3060596A (en) * 1961-08-22 1962-10-30 Dalto Electronics Corp Electronic system for generating a perspective image
US3098929A (en) * 1959-01-02 1963-07-23 Gen Electric Electronic contact analog simulator
US3261912A (en) * 1965-04-08 1966-07-19 Gen Precision Inc Simulated viewpoint displacement apparatus

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3098929A (en) * 1959-01-02 1963-07-23 Gen Electric Electronic contact analog simulator
US3060596A (en) * 1961-08-22 1962-10-30 Dalto Electronics Corp Electronic system for generating a perspective image
US3261912A (en) * 1965-04-08 1966-07-19 Gen Precision Inc Simulated viewpoint displacement apparatus

Cited By (144)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3943344A (en) * 1973-06-30 1976-03-09 Tokyo Shibaura Electric Co., Ltd. Apparatus for measuring the elevation of a three-dimensional foreground subject
US3892051A (en) * 1973-10-31 1975-07-01 Gen Electric Simulated collimation of computer generated images
US4296930A (en) * 1975-11-26 1981-10-27 Bally Manufacturing Corporation TV Game apparatus
US4283765A (en) * 1978-04-14 1981-08-11 Tektronix, Inc. Graphics matrix multiplier
US4263726A (en) * 1978-04-22 1981-04-28 Redifon Simulation Limited Visual display apparatus
US4475172A (en) * 1978-05-30 1984-10-02 Bally Manufacturing Corporation Audio/visual home computer and game apparatus
US4208719A (en) * 1978-08-10 1980-06-17 The Singer Company Edge smoothing for real-time simulation of a polygon face object system as viewed by a moving observer
US4241519A (en) * 1979-01-25 1980-12-30 The Ohio State University Research Foundation Flight simulator with spaced visuals
US4276029A (en) * 1979-05-17 1981-06-30 The Ohio State University Visual cue simulator
FR2466061A1 (en) * 1979-06-15 1981-03-27 Redifon Simulation Ltd IMPROVEMENT TO VISUALIZATION SYSTEMS OF THE IMAGE TYPE GENERATED BY COMPUTER
US4521196A (en) * 1981-06-12 1985-06-04 Giravions Dorand Method and apparatus for formation of a fictitious target in a training unit for aiming at targets
US4500879A (en) * 1982-01-06 1985-02-19 Smith Engineering Circuitry for controlling a CRT beam
FR2550873A1 (en) * 1983-08-19 1985-02-22 Thomson Csf SYSTEM FOR VIEWING IMAGES RECORDED ON FILM AND USABLE FOR FLIGHT SIMULATION
US4827438A (en) * 1987-03-30 1989-05-02 Halliburton Company Method and apparatus related to simulating train responses to actual train operating data
US4853883A (en) * 1987-11-09 1989-08-01 Nickles Stephen K Apparatus and method for use in simulating operation and control of a railway train
US4982345A (en) * 1989-01-23 1991-01-01 International Business Machines Corporation Interactive computer graphics display system processing method for identifying an operator selected displayed object
US5253051A (en) * 1991-03-05 1993-10-12 Mcmanigal Paul G Video artificial window apparatus
US5247356A (en) * 1992-02-14 1993-09-21 Ciampa John A Method and apparatus for mapping and measuring land
US5684937A (en) * 1992-12-14 1997-11-04 Oxaal; Ford Method and apparatus for performing perspective transformation on visible stimuli
US5936630A (en) * 1992-12-14 1999-08-10 Oxaal; Ford Method of and apparatus for performing perspective transformation of visible stimuli
US7542035B2 (en) 1995-11-15 2009-06-02 Ford Oxaal Method for interactively viewing full-surround image data and apparatus therefor
US20040004621A1 (en) * 1995-11-15 2004-01-08 Ford Oxaal Method for interactively viewing full-surround image data and apparatus therefor
US9811922B2 (en) 2002-11-08 2017-11-07 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
US20090096884A1 (en) * 2002-11-08 2009-04-16 Schultz Stephen L Method and Apparatus for Capturing, Geolocating and Measuring Oblique Images
US10607357B2 (en) 2002-11-08 2020-03-31 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
US11069077B2 (en) 2002-11-08 2021-07-20 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
US7995799B2 (en) 2002-11-08 2011-08-09 Pictometry International Corporation Method and apparatus for capturing geolocating and measuring oblique images
US20100302243A1 (en) * 2002-11-08 2010-12-02 Schultz Stephen L Method and apparatus for capturing geolocating and measuring oblique images
US7787659B2 (en) 2002-11-08 2010-08-31 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
US9443305B2 (en) 2002-11-08 2016-09-13 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
US7529424B2 (en) 2003-05-02 2009-05-05 Grandeye, Ltd. Correction of optical distortion by image processing
US20050007453A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Method and system of simultaneously displaying multiple views for video surveillance
US9602700B2 (en) 2003-05-02 2017-03-21 Grandeye Ltd. Method and system of simultaneously displaying multiple views for video surveillance
US20050007479A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Multiple object processing in wide-angle video camera
US7450165B2 (en) 2003-05-02 2008-11-11 Grandeye, Ltd. Multiple-view processing in wide-angle video camera
US20050007477A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Correction of optical distortion by image processing
US20050007478A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Multiple-view processing in wide-angle video camera
US7528881B2 (en) 2003-05-02 2009-05-05 Grandeye, Ltd. Multiple object processing in wide-angle video camera
US20050028215A1 (en) * 2003-06-03 2005-02-03 Yavuz Ahiska Network camera supporting multiple IP addresses
US7893985B1 (en) 2004-03-15 2011-02-22 Grandeye Ltd. Wide angle electronic camera with improved peripheral vision
US8427538B2 (en) 2004-04-30 2013-04-23 Oncam Grandeye Multiple view and multiple object processing in wide-angle video camera
US20100002071A1 (en) * 2004-04-30 2010-01-07 Grandeye Ltd. Multiple View and Multiple Object Processing in Wide-Angle Video Camera
US7366359B1 (en) 2004-07-08 2008-04-29 Grandeye, Ltd. Image processing of regions in a wide angle video camera
US20080211903A1 (en) * 2004-07-08 2008-09-04 Grandeye, Ltd. Image processing of regions in a wide angle video camera
US8145007B2 (en) 2004-07-08 2012-03-27 Grandeye, Ltd. Image processing of regions in a wide angle video camera
US20060056056A1 (en) * 2004-07-19 2006-03-16 Grandeye Ltd. Automatically expanding the zoom capability of a wide-angle video camera
US8405732B2 (en) 2004-07-19 2013-03-26 Grandeye, Ltd. Automatically expanding the zoom capability of a wide-angle video camera
US7990422B2 (en) 2004-07-19 2011-08-02 Grandeye, Ltd. Automatically expanding the zoom capability of a wide-angle video camera
US20060062478A1 (en) * 2004-08-16 2006-03-23 Grandeye, Ltd., Region-sensitive compression of digital video
US8860780B1 (en) 2004-09-27 2014-10-14 Grandeye, Ltd. Automatic pivoting in a wide-angle video camera
US9141615B1 (en) 2004-11-12 2015-09-22 Grandeye, Ltd. Interactive media server
US7894531B1 (en) 2005-02-15 2011-02-22 Grandeye Ltd. Method of compression for wide angle digital video
US20070124783A1 (en) * 2005-11-23 2007-05-31 Grandeye Ltd, Uk, Interactive wide-angle video server
US8723951B2 (en) 2005-11-23 2014-05-13 Grandeye, Ltd. Interactive wide-angle video server
US9805489B2 (en) 2006-08-30 2017-10-31 Pictometry International Corp. Mosaic oblique images and methods of making and using same
US11080911B2 (en) 2006-08-30 2021-08-03 Pictometry International Corp. Mosaic oblique images and systems and methods of making and using same
US20080123994A1 (en) * 2006-08-30 2008-05-29 Stephen Schultz Mosaic Oblique Images and Methods of Making and Using Same
US7873238B2 (en) 2006-08-30 2011-01-18 Pictometry International Corporation Mosaic oblique images and methods of making and using same
US9437029B2 (en) 2006-08-30 2016-09-06 Pictometry International Corp. Mosaic oblique images and methods of making and using same
US9959653B2 (en) 2006-08-30 2018-05-01 Pictometry International Corporation Mosaic oblique images and methods of making and using same
US10489953B2 (en) 2006-08-30 2019-11-26 Pictometry International Corp. Mosaic oblique images and methods of making and using same
US20080231700A1 (en) * 2007-02-01 2008-09-25 Stephen Schultz Computer System for Continuous Oblique Panning
US8593518B2 (en) 2007-02-01 2013-11-26 Pictometry International Corp. Computer system for continuous oblique panning
US8520079B2 (en) 2007-02-15 2013-08-27 Pictometry International Corp. Event multiplexer for managing the capture of images
US20080204570A1 (en) * 2007-02-15 2008-08-28 Stephen Schultz Event Multiplexer For Managing The Capture of Images
US10198803B2 (en) 2007-05-01 2019-02-05 Pictometry International Corp. System for detecting image abnormalities
US10679331B2 (en) 2007-05-01 2020-06-09 Pictometry International Corp. System for detecting image abnormalities
US8385672B2 (en) 2007-05-01 2013-02-26 Pictometry International Corp. System for detecting image abnormalities
US11514564B2 (en) 2007-05-01 2022-11-29 Pictometry International Corp. System for detecting image abnormalities
US20080273753A1 (en) * 2007-05-01 2008-11-06 Frank Giuffrida System for Detecting Image Abnormalities
US9633425B2 (en) 2007-05-01 2017-04-25 Pictometry International Corp. System for detecting image abnormalities
US11100625B2 (en) 2007-05-01 2021-08-24 Pictometry International Corp. System for detecting image abnormalities
US9262818B2 (en) 2007-05-01 2016-02-16 Pictometry International Corp. System for detecting image abnormalities
US9959609B2 (en) 2007-05-01 2018-05-01 Pictometry International Corporation System for detecting image abnormalities
US7991226B2 (en) 2007-10-12 2011-08-02 Pictometry International Corporation System and process for color-balancing a series of oblique images
US20140126816A1 (en) * 2007-10-12 2014-05-08 Pictometry International Corp. System and process for color-balancing a series of oblique images
US20090097744A1 (en) * 2007-10-12 2009-04-16 Stephen Schultz System and Process for Color-Balancing a Series of Oblique Images
US8649596B2 (en) * 2007-10-12 2014-02-11 Pictometry International Corp. System and process for color-balancing a series of oblique images
US11087506B2 (en) 2007-10-12 2021-08-10 Pictometry International Corp. System and process for color-balancing a series of oblique images
US9503615B2 (en) 2007-10-12 2016-11-22 Pictometry International Corp. System and process for color-balancing a series of oblique images
US8971624B2 (en) * 2007-10-12 2015-03-03 Pictometry International Corp. System and process for color-balancing a series of oblique images
US10580169B2 (en) 2007-10-12 2020-03-03 Pictometry International Corp. System and process for color-balancing a series of oblique images
US20120183217A1 (en) * 2007-10-12 2012-07-19 Stephen Schultz System and process for color-balancing a series of oblique images
US9275496B2 (en) 2007-12-03 2016-03-01 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real facade texture
US9972126B2 (en) 2007-12-03 2018-05-15 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real facade texture
US10573069B2 (en) 2007-12-03 2020-02-25 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real facade texture
US20090141020A1 (en) * 2007-12-03 2009-06-04 Freund Joseph G Systems and methods for rapid three-dimensional modeling with real facade texture
US10229532B2 (en) 2007-12-03 2019-03-12 Pictometry International Corporation Systems and methods for rapid three-dimensional modeling with real facade texture
US9520000B2 (en) 2007-12-03 2016-12-13 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real facade texture
US9836882B2 (en) 2007-12-03 2017-12-05 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real facade texture
US11263808B2 (en) 2007-12-03 2022-03-01 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real façade texture
US8531472B2 (en) 2007-12-03 2013-09-10 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real façade texture
US10896540B2 (en) 2007-12-03 2021-01-19 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real façade texture
US10839484B2 (en) 2008-08-05 2020-11-17 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area
US10424047B2 (en) 2008-08-05 2019-09-24 Pictometry International Corp. Cut line steering methods for forming a mosaic image of a geographical area
US11551331B2 (en) 2008-08-05 2023-01-10 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area
US8588547B2 (en) 2008-08-05 2013-11-19 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area
US9898802B2 (en) 2008-08-05 2018-02-20 Pictometry International Corp. Cut line steering methods for forming a mosaic image of a geographical area
US8401222B2 (en) 2009-05-22 2013-03-19 Pictometry International Corp. System and process for roof measurement using aerial imagery
US9933254B2 (en) 2009-05-22 2018-04-03 Pictometry International Corp. System and process for roof measurement using aerial imagery
US20100296693A1 (en) * 2009-05-22 2010-11-25 Thornberry Dale R System and process for roof measurement using aerial imagery
US9959667B2 (en) 2009-10-26 2018-05-01 Pictometry International Corp. Method for the automatic material classification and texture simulation for 3D models
US10198857B2 (en) 2009-10-26 2019-02-05 Pictometry International Corp. Method for the automatic material classification and texture simulation for 3D models
US20110096083A1 (en) * 2009-10-26 2011-04-28 Stephen Schultz Method for the automatic material classification and texture simulation for 3d models
US9330494B2 (en) 2009-10-26 2016-05-03 Pictometry International Corp. Method for the automatic material classification and texture simulation for 3D models
US8477190B2 (en) 2010-07-07 2013-07-02 Pictometry International Corp. Real-time moving platform management system
US11483518B2 (en) 2010-07-07 2022-10-25 Pictometry International Corp. Real-time moving platform management system
US11003943B2 (en) 2010-12-17 2021-05-11 Pictometry International Corp. Systems and methods for processing images with edge detection and snap-to feature
US10621463B2 (en) 2010-12-17 2020-04-14 Pictometry International Corp. Systems and methods for processing images with edge detection and snap-to feature
US8823732B2 (en) 2010-12-17 2014-09-02 Pictometry International Corp. Systems and methods for processing images with edge detection and snap-to feature
US10325350B2 (en) 2011-06-10 2019-06-18 Pictometry International Corp. System and method for forming a video stream containing GIS data in real-time
US9183538B2 (en) 2012-03-19 2015-11-10 Pictometry International Corp. Method and system for quick square roof reporting
US10346935B2 (en) 2012-03-19 2019-07-09 Pictometry International Corp. Medium and method for quick square roof reporting
US10311238B2 (en) 2013-03-12 2019-06-04 Pictometry International Corp. System and method for performing sensitive geo-spatial processing in non-sensitive operator environments
US11525897B2 (en) 2013-03-12 2022-12-13 Pictometry International Corp. LiDAR system producing multiple scan paths and method of making and using same
US9881163B2 (en) 2013-03-12 2018-01-30 Pictometry International Corp. System and method for performing sensitive geo-spatial processing in non-sensitive operator environments
US10502813B2 (en) 2013-03-12 2019-12-10 Pictometry International Corp. LiDAR system producing multiple scan paths and method of making and using same
US10311089B2 (en) 2013-03-15 2019-06-04 Pictometry International Corp. System and method for early access to captured images
US9753950B2 (en) 2013-03-15 2017-09-05 Pictometry International Corp. Virtual property reporting for automatic structure detection
US9805059B2 (en) 2013-03-15 2017-10-31 Pictometry International Corp. System and method for early access to captured images
US9275080B2 (en) 2013-03-15 2016-03-01 Pictometry International Corp. System and method for early access to captured images
US10204269B2 (en) 2014-01-10 2019-02-12 Pictometry International Corp. Unmanned aircraft obstacle avoidance
US11087131B2 (en) 2014-01-10 2021-08-10 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US9612598B2 (en) 2014-01-10 2017-04-04 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US11747486B2 (en) 2014-01-10 2023-09-05 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10181081B2 (en) 2014-01-10 2019-01-15 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10181080B2 (en) 2014-01-10 2019-01-15 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10318809B2 (en) 2014-01-10 2019-06-11 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10037464B2 (en) 2014-01-10 2018-07-31 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10037463B2 (en) 2014-01-10 2018-07-31 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10032078B2 (en) 2014-01-10 2018-07-24 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US11120262B2 (en) 2014-01-10 2021-09-14 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10338222B2 (en) 2014-01-31 2019-07-02 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US9292913B2 (en) 2014-01-31 2016-03-22 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US10942276B2 (en) 2014-01-31 2021-03-09 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US9542738B2 (en) 2014-01-31 2017-01-10 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US11686849B2 (en) 2014-01-31 2023-06-27 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US10571575B2 (en) 2014-01-31 2020-02-25 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US11100259B2 (en) 2014-02-08 2021-08-24 Pictometry International Corp. Method and system for displaying room interiors on a floor plan
US9953112B2 (en) 2014-02-08 2018-04-24 Pictometry International Corp. Method and system for displaying room interiors on a floor plan
US11417081B2 (en) 2016-02-15 2022-08-16 Pictometry International Corp. Automated system and methodology for feature extraction
US10796189B2 (en) 2016-02-15 2020-10-06 Pictometry International Corp. Automated system and methodology for feature extraction
US10402676B2 (en) 2016-02-15 2019-09-03 Pictometry International Corp. Automated system and methodology for feature extraction
US10671648B2 (en) 2016-02-22 2020-06-02 Eagle View Technologies, Inc. Integrated centralized property database systems and methods

Also Published As

Publication number Publication date
JPS564918B2 (en) 1981-02-02
JPS4873234A (en) 1973-10-03

Similar Documents

Publication Publication Date Title
US3725563A (en) Method of perspective transformation in scanned raster visual display
US3659920A (en) Wide angle infinity image visual display
US4343037A (en) Visual display systems of the computer generated image type
US3418459A (en) Graphic construction display generator
US6052100A (en) Computer controlled three-dimensional volumetric display
US3439105A (en) Visual display system
US3697681A (en) Placement of image on matrix display
US4825381A (en) Moving map display
US3996673A (en) Image generating means
US3757040A (en) Wide angle display for digitally generated video information
Schachter Computer image generation for flight simulation
US3892051A (en) Simulated collimation of computer generated images
US4827252A (en) Display methods and apparatus
US4054917A (en) Synthetic terrain generators
US3539696A (en) Real-time dynamic perspective display
US4241519A (en) Flight simulator with spaced visuals
US4024539A (en) Method and apparatus for flight path control
US3787619A (en) Wide angle display system
US5550959A (en) Technique and system for the real-time generation of perspective images
GB2061074A (en) Improved visual display systems for computer generated images
EP0499374A1 (en) Simulator image perspective alteration optics
US3247317A (en) Satellite visual simulator
US3611590A (en) Visual system computer
EP0250588B1 (en) Comprehensive distortion correction in a real time imaging system
US4111536A (en) Automatic registration of projector images

Legal Events

Date Code Title Description
AS Assignment

Owner name: LINK FLIGHT SIMULATION CORPORATION, KIRKWOOD INDUS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST.;ASSIGNOR:SINGER COMPANY, THE, A NJ CORP.;REEL/FRAME:004998/0190

Effective date: 19880425

AS Assignment

Owner name: CAE-LINK CORPORATION, A CORP. OF DE.

Free format text: MERGER;ASSIGNORS:LINK FLIGHT SIMULATION CORPORATION, A DE CORP.;LINK FACTICAL MILITARY SIMULATION CORPORATION, A CORP. OF DE;LINK TRAINING SERVICES CORPORATION, A CORP. OF DE (MERGED INTO);AND OTHERS;REEL/FRAME:005252/0187

Effective date: 19881130