A head-up display or heads-up display, also known as a HUD, is any transparent display that presents data without requiring users to look away from their usual viewpoints. The origin of the name stems from a pilot being able to view information with the head positioned "up" and looking forward, instead of angled down looking at lower instruments. A HUD also has the advantage that the pilot's eyes do not need to refocus to view the outside after looking at the optically nearer instruments.
Although they were initially developed for military aviation, HUDs are now used in commercial aircraft, automobiles and other, mostly professional applications.
A typical HUD contains three primary components: a projector unit, a combiner, and a video generation computer.
The projection unit in a typical HUD is an optical collimator setup: a convex lens or concave mirror with a cathode-ray tube, light-emitting diode display, or liquid-crystal display at its focus. This setup (a design in use since the invention of the reflector sight in 1900) produces an image in which the light is collimated, i.e. the focal point is perceived to be at infinity.
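The collimation described above follows directly from the thin-lens equation: as the display approaches the focal plane, the virtual image recedes toward infinity. A minimal illustrative sketch (the 100 mm focal length is an assumption chosen for demonstration, not a figure from any real HUD):

```python
def image_distance(f_mm: float, d_o_mm: float) -> float:
    """Image distance d_i from the thin-lens equation 1/f = 1/d_o + 1/d_i.

    A negative result indicates a virtual image on the display's side of
    the optics, as in a collimating magnifier; at d_o == f the image is
    at infinity (collimated light).
    """
    if d_o_mm == f_mm:
        return float("inf")
    return 1.0 / (1.0 / f_mm - 1.0 / d_o_mm)

# Hypothetical 100 mm focal-length collimator: the closer the display sits
# to the focal plane, the farther away its virtual image appears.
for d_o in (50.0, 90.0, 99.0, 100.0):
    print(f"display at {d_o:5.1f} mm -> image at {image_distance(100.0, d_o):10.1f} mm")
```

The sign convention is the standard Gaussian one; the viewer perceives the magnitude of the image distance as the apparent focus depth, which is why the symbology needs no refocusing against the distant outside scene.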
The combiner is typically an angled flat piece of glass (a beam splitter) located directly in front of the viewer that redirects the projected image from the projector so that the viewer sees the field of view and the projected infinity image at the same time. Combiners may have special coatings that reflect the monochromatic light projected onto them from the projector unit while allowing all other wavelengths of light to pass through. In some optical layouts combiners may also have a curved surface to refocus the image from the projector.
The computer provides the interface between the HUD (i.e. the projection unit) and the systems and data to be displayed, and generates the imagery and symbology for the projection unit.
In addition to fixed-mounted HUDs, there are also head-mounted displays (HMDs), including helmet-mounted displays (both abbreviated HMD): forms of HUD that feature a display element which moves with the orientation of the user's head.
Many modern fighters (such as the F/A-18, F-16 and Eurofighter) use both a HUD and HMD concurrently. The F-35 Lightning II was designed without a HUD, relying solely on the HMD, making it the first modern military fighter not to have a fixed HUD.
HUDs are split into four generations reflecting the technology used to generate the images.
Newer micro-display imaging technologies are being introduced, including liquid crystal displays (LCD), liquid crystal on silicon (LCoS), digital micromirror devices (DMD), and organic light-emitting diodes (OLED).
HUDs evolved from the reflector sight, a pre-World War II parallax-free optical sight technology for military fighter aircraft. The gyro gunsight added a reticle that moved based on the aircraft's speed and turn rate to compute the amount of lead needed to hit a target while maneuvering.
During the early 1940s, the Telecommunications Research Establishment (TRE), in charge of UK radar development, found that Royal Air Force (RAF) night fighter pilots were having a hard time reacting to the verbal instruction of the radar operator as they approached their targets. They experimented with the addition of a second radar display for the pilot, but found they had trouble looking up from the lit screen into the dark sky in order to find the target. In October 1942 they had successfully combined the image from the radar tube with a projection from their standard GGS Mk. II gyro gunsight on a flat area of the windscreen. A key upgrade was the move from the original AI Mk. IV radar to the microwave-frequency AI Mk. VII radar found on the de Havilland Mosquito night fighter. This set produced an artificial horizon that further eased head-up flying.
In 1955 the US Navy's Office of Naval Research and Development did some research with a mockup HUD concept unit along with a sidestick controller in an attempt to ease the pilot's burden flying modern jet aircraft and make the instrumentation less complicated during flight. While their research was never incorporated in any aircraft of that time, the crude HUD mockup they built had all the features of today's modern HUD units.
HUD technology was next advanced by the Royal Navy in the Buccaneer, the prototype of which first flew on 30 April 1958. The aircraft's design called for an attack sight that would provide navigation and weapon-release information for the low-level attack mode. There was fierce competition between supporters of the new HUD design and supporters of the old electro-mechanical gunsight, with the HUD being described as a radical, even foolhardy, option. The Air Arm branch of the Ministry of Defence sponsored the development of a Strike Sight. The Royal Aircraft Establishment (RAE) designed the equipment, which was built by Cintel, and the system was first integrated in 1958. The Cintel HUD business was taken over by Elliott Flight Automation, and the Buccaneer HUD was manufactured and further developed, continuing up to a Mark III version with a total of 375 systems made; it was given a 'fit and forget' title by the Royal Navy and was still in service nearly 25 years later. BAE Systems thus has a claim to the world's first head-up display in operational service. The earliest usage of the term "head-up display" can be traced to the Royal Aircraft Establishment's work during this time.
In the United Kingdom, it was soon noted that pilots flying with the new gun-sights were becoming better at piloting their aircraft. At this point, the HUD expanded its purpose beyond weapon aiming to general piloting. In the 1960s, French test-pilot Gilbert Klopfstein created the first modern HUD and a standardized system of HUD symbols so that pilots would only have to learn one system and could more easily transition between aircraft. The modern HUD used in instrument flight rules approaches to landing was developed in 1975. Klopfstein pioneered HUD technology in military fighter jets and helicopters, aiming to centralize critical flight data within the pilot's field of vision. This approach sought to increase the pilot's scan efficiency and reduce "task saturation" and information overload.
Use of HUDs then expanded beyond military aircraft. In the 1970s, the HUD was introduced to commercial aviation, and in 1988, the Oldsmobile Cutlass Supreme became the first production car with a head-up display.
Until a few years ago, the Embraer 190, Saab 2000, Boeing 727, Boeing 737-300, -400 and -500, and Boeing 737 Next Generation aircraft (737-600, -700, -800 and -900 series) were the only commercial passenger aircraft available with HUDs. However, the technology is becoming more common, with aircraft such as the Canadair RJ, Airbus A318 and several business jets featuring the displays. HUDs have become standard equipment on the Boeing 787. Furthermore, the Airbus A320, A330, A340 and A380 families are currently undergoing the certification process for a HUD. HUDs were also added to the Space Shuttle orbiter.
Several factors interplay in the design of a HUD, including the field of view, eyebox, luminance and contrast, boresighting, and display scaling.
On aircraft avionics systems, HUDs typically operate from dual independent redundant computer systems. They receive input directly from the sensors (pitot-static, gyroscopic, navigation, etc.) aboard the aircraft and perform their own computations rather than receiving previously computed data from the flight computers. On other aircraft (the Boeing 787, for example) the HUD guidance computation for Low Visibility Take-off (LVTO) and low visibility approach comes from the same flight guidance computer that drives the autopilot. Computers are integrated with the aircraft's systems and allow connectivity onto several different data buses such as the ARINC 429, ARINC 629, and MIL-STD-1553.
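As an illustration of the kind of data-bus traffic involved, the sketch below packs a single ARINC 429 word (32 bits: label in bits 1–8, SDI in bits 9–10, data in bits 11–29, SSM in bits 30–31, odd parity in bit 32). This is a simplified, uncertified sketch; real implementations also handle label bit ordering, BNR/BCD data encodings, and transmission timing, all omitted here:

```python
def build_word(label: int, sdi: int, data: int, ssm: int) -> int:
    """Pack a simplified ARINC 429 word: label (8 bits), SDI (2 bits),
    data (19 bits), SSM (2 bits), and odd parity in the top bit."""
    word = label & 0xFF              # bits 1-8: label (conventionally octal)
    word |= (sdi & 0x3) << 8         # bits 9-10: source/destination identifier
    word |= (data & 0x7FFFF) << 10   # bits 11-29: data field
    word |= (ssm & 0x3) << 29        # bits 30-31: sign/status matrix
    # Bit 32 is set so the total number of 1-bits is odd (odd parity).
    if bin(word).count("1") % 2 == 0:
        word |= 1 << 31
    return word

def parity_ok(word: int) -> bool:
    """A received word is valid only if its overall 1-bit count is odd."""
    return bin(word).count("1") % 2 == 1

# Hypothetical word: label 203 (octal), SDI 0, arbitrary data, SSM "normal".
w = build_word(0o203, 0, 12345, 3)
print(f"word = 0x{w:08X}, parity ok = {parity_ok(w)}")
```

Odd parity means that any single flipped bit on the wire is detectable, which is why a receiver discards a word whose parity check fails.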
Other symbols and data are also available in some HUDs.
Since being introduced on HUDs, both the FPV and acceleration symbols are becoming standard on head-down displays (HDD). The actual form of the FPV symbol on an HDD is not standardized but is usually a simple aircraft drawing, such as a circle with two short angled lines (180 ± 30 degrees) and "wings" on the ends of the descending line. Keeping the FPV on the horizon allows the pilot to fly level turns at various angles of bank.
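The geometry behind "keeping the FPV on the horizon" is simple: the FPV's height above or below the horizon line is the flight-path angle, which is zero exactly when altitude is constant. A hedged sketch (units and function names are illustrative, not from any avionics standard):

```python
import math

def flight_path_angle_deg(vertical_speed_fpm: float, ground_speed_kt: float) -> float:
    """Flight-path angle in degrees from vertical speed (ft/min) and
    ground speed (knots) -- the angle at which the FPV sits relative
    to the horizon line."""
    # Convert ground speed to ft/min: 1 knot = 6076.12 ft/hr.
    gs_fpm = ground_speed_kt * 6076.12 / 60.0
    return math.degrees(math.atan2(vertical_speed_fpm, gs_fpm))

# FPV on the horizon: level flight regardless of bank angle.
print(flight_path_angle_deg(0.0, 140.0))     # 0.0 degrees
# Descending 700 ft/min at 140 kt: about -2.8 degrees, near a
# standard 3-degree glideslope.
print(flight_path_angle_deg(-700.0, 140.0))
```

This is why the FPV is so useful in a turn: pitch attitude changes with bank, but the FPV stays on the horizon whenever the vertical speed is zero.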
In addition to the generic information described above, military applications include weapons-system and sensor data.
During the 1980s, the military tested the use of HUDs in vertical take-off and landing (VTOL) and short take-off and landing (STOL) aircraft. A HUD format was developed at NASA Ames Research Center to provide pilots of V/STOL aircraft with complete flight guidance and control information for Category IIIC terminal-area flight operations. This covers a large variety of flight operations, from STOL flights on land-based runways to VTOL operations on aircraft carriers. The principal features of this display format are the integration of flightpath and pursuit guidance information into a narrow field of view, easily assimilated by the pilot with a single glance, and the superposition of vertical and horizontal situation information. The display is a derivative of a successful design developed for conventional transport aircraft.
The use of head-up displays allows commercial aircraft substantial flexibility in their operations. Systems have been approved which allow reduced-visibility takeoffs and landings, as well as full Category IIIA landings and roll-outs. Studies have shown that the use of a HUD during landings decreases the lateral deviation from centerline in all landing conditions, although the touchdown point along the centerline is not changed.
In more advanced systems, such as the US Federal Aviation Administration (FAA)-labeled 'Enhanced Flight Vision System', a real-world visual image can be overlaid onto the combiner. Typically an infrared camera (either single- or multi-band) is installed in the nose of the aircraft to display a conformed image to the pilot. 'EVS Enhanced Vision System' is an industry-accepted term which the FAA decided not to use because "the FAA believes [it] could be confused with the system definition and operational concept found in 91.175(l) and (m)". In one EVS installation, the camera is actually installed at the top of the vertical stabilizer rather than "as close as practical to the pilot's eye position". When used with a HUD, however, the camera must be mounted as close as possible to the pilot's eye point, as the image is expected to "overlay" the real world as the pilot looks through the combiner.
"Registration," or the accurate overlay of the EVS image with the real world image, is one feature closely examined by authorities prior to approval of a HUD based EVS. This is because of the importance of the HUD matching the real world.
While the EVS display can greatly help, the FAA has only relaxed operating regulations so that an aircraft with EVS can perform a Category I approach to Category II minimums. In all other cases the flight crew must comply with all "unaided" visual restrictions. (For example, if the runway visibility is restricted because of fog, even though EVS may provide a clear visual image it is not appropriate (or legal) to maneuver the aircraft using only the EVS below 100 feet above ground level.)
HUD systems are also being designed to display a synthetic vision system (SVS) graphic image, which uses high precision navigation, attitude, altitude and terrain databases to create realistic and intuitive views of the outside world.
In the first SVS head-down image shown on the right, immediately visible indicators include the airspeed tape on the left, altitude tape on the right, and turn/bank/slip/skid displays at the top center. The boresight symbol (-v-) is in the center and directly below it is the flight path vector (FPV) symbol (the circle with short wings and a vertical stabilizer). The horizon line is visible running across the display with a break at the center, and directly to the left are numbers at ±10 degrees with a short line at ±5 degrees (the +5 degree line is easier to see) which, along with the horizon line, show the pitch of the aircraft. Unlike this color depiction of SVS on a head-down primary flight display, the SVS displayed on a HUD is monochrome – that is, typically, in shades of green.
The image indicates a wings level aircraft (i.e. the flight path vector symbol is flat relative to the horizon line and there is zero roll on the turn/bank indicator). Airspeed is 140 knots, altitude is 9,450 feet, heading is 343 degrees (the number below the turn/bank indicator). Close inspection of the image shows a small purple circle which is displaced from the flight path vector slightly to the lower right. This is the guidance cue coming from the Flight Guidance System. When stabilized on the approach, this purple symbol should be centered within the FPV.
The terrain is entirely computer generated from a high resolution terrain database.
In some systems, the SVS will calculate the aircraft's current flight path, or possible flight path (based on an aircraft performance model, the aircraft's current energy, and surrounding terrain) and then turn any obstructions red to alert the flight crew. Such a system might have helped prevent the crash of American Airlines Flight 965 into a mountain in December 1995.
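The terrain-alerting idea can be sketched as a simple comparison between the projected flight path and a terrain profile ahead of the aircraft. Everything below (the function names, the 500 ft clearance margin, the one-sample-per-nautical-mile profile) is a hypothetical illustration, not the logic of any certified SVS:

```python
def flag_conflicts(altitude_ft, climb_fpm, ground_speed_kt, terrain_profile_ft,
                   step_nm=1.0, clearance_ft=500.0):
    """Return indices of terrain samples (one per step_nm ahead along track)
    that the projected flight path fails to clear by at least clearance_ft.
    A real SVS would render these cells in red."""
    minutes_per_nm = 60.0 / ground_speed_kt          # time to cross one nm
    conflicts = []
    for i, terrain_ft in enumerate(terrain_profile_ft, start=1):
        # Altitude the aircraft will have when it reaches this sample,
        # assuming the current climb/descent rate is held.
        projected = altitude_ft + climb_fpm * minutes_per_nm * step_nm * i
        if projected < terrain_ft + clearance_ft:
            conflicts.append(i - 1)
    return conflicts

# Level at 9,450 ft and 140 kt with rising terrain ahead: the last two
# samples (9,500 ft and 11,000 ft) would be flagged.
print(flag_conflicts(9450.0, 0.0, 140.0, [3000.0, 6000.0, 9500.0, 11000.0]))
```

A production system would also account for aircraft performance limits and surrounding (not just along-track) terrain, as the text notes.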
On the left side of the display is an SVS-unique symbol with the appearance of a purple, diminishing sideways ladder, which continues on the right of the display. The two lines define a "tunnel in the sky". This symbol defines the desired trajectory of the aircraft in three dimensions. For example, if the pilot had selected an airport to the left, then this symbol would curve off to the left and down. If the pilot keeps the flight path vector alongside the trajectory symbol, the craft will fly the optimum path. This path would be based on information stored in the Flight Management System's database and would show the FAA-approved approach for that airport.
The tunnel in the sky can also greatly assist the pilot when more precise four-dimensional flying is required, such as the decreased vertical or horizontal clearance requirements of Required Navigation Performance (RNP). Under such conditions the pilot is given a graphical depiction of where the aircraft should be and where it should be going rather than the pilot having to mentally integrate altitude, airspeed, heading, energy and longitude and latitude to correctly fly the aircraft.
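The core computation behind such a lateral cue can be sketched as a cross-track distance checked against an RNP containment value. The flat-earth geometry and function names below are illustrative assumptions; real RNP computations use geodesic math:

```python
import math

def cross_track_nm(p, a, b):
    """Signed cross-track distance of aircraft position p from the desired
    leg a->b (all points as (east_nm, north_nm); positive = right of track)."""
    leg = (b[0] - a[0], b[1] - a[1])
    rel = (p[0] - a[0], p[1] - a[1])
    leg_len = math.hypot(*leg)
    # 2-D cross product divided by leg length gives perpendicular distance.
    return (rel[0] * leg[1] - rel[1] * leg[0]) / leg_len

def within_rnp(p, a, b, rnp_nm=0.3):
    """True if the aircraft is inside the RNP lateral containment limit."""
    return abs(cross_track_nm(p, a, b)) <= rnp_nm

# 0.2 nm right of a northbound leg, checked against RNP 0.3:
print(within_rnp((0.2, 5.0), (0.0, 0.0), (0.0, 10.0)))  # True
```

Displaying this offset graphically, together with the vertical deviation, is what spares the pilot from mentally integrating position, altitude, and heading into a containment judgment.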
These displays are becoming increasingly available in production cars, and usually offer speedometer, tachometer, and navigation system displays. Night vision information is also displayed via HUD on certain automobiles.
Add-on HUD systems also exist, projecting the display onto a glass combiner mounted above or below the windshield.
In 2012 Pioneer Corporation introduced a HUD navigation system that replaces the driver-side sun visor and visually overlays animations of conditions ahead, a form of augmented reality (AR). This AR-HUD was the first aftermarket automotive head-up display to use a direct-to-eye laser beam scanning method, also known as virtual retinal display (VRD); its core technology is a miniature laser-beam-scanning display developed by MicroVision, Inc.
HUDs have been proposed or are being experimentally developed for a number of other applications. In the military, a HUD can be used to overlay tactical information, such as the output of a laser rangefinder or squadmate locations, for infantrymen. A prototype HUD has also been developed that displays information on the inside of a swimmer's goggles or a scuba diver's mask. A group of electrical engineering students from the University of Massachusetts Amherst is integrating technologies to develop an affordable personal head-up display; one such design is a HUD in skiing goggles. HUD systems that project information directly onto the wearer's retina with a low-powered laser (virtual retinal display) are also in experimentation.