Telepresence refers to a set of technologies which allow a person to feel as if they were present, to give the appearance of being present, or to have an effect, via telerobotics, at a place other than their true location.
Telepresence requires that the users' senses be provided with such stimuli as to give the feeling of being in that other location. Additionally, users may be given the ability to affect the remote location. In this case, the user's position, movements, actions, voice, etc. may be sensed, transmitted and duplicated in the remote location to bring about this effect. Information may therefore travel in both directions between the user and the remote location.
A popular application is found in telepresence videoconferencing, the highest possible level of videotelephony. Telepresence via video deploys greater technical sophistication and improved fidelity of both sight and sound than in traditional videoconferencing. Technical advancements in mobile collaboration have also extended the capabilities of videoconferencing beyond the boardroom for use with hand-held mobile devices, enabling collaboration independent of location.
In a pioneering paper, the U.S. cognitive scientist Marvin Minsky attributed the development of the idea of telepresence to science fiction author Robert A. Heinlein: "My first vision of a remote-controlled economy came from Robert A. Heinlein's prophetic 1948 [sic] novel, Waldo," wrote Minsky. In his science fiction short story "Waldo" (1942), Heinlein first proposed a primitive telepresence master-slave manipulator system.
The Brother Assassin, written by Fred Saberhagen in 1969, introduced the complete concept for a telepresence master-slave humanoid system. In the novel, the concept is described as follows: "And a moment later it seemed to all his senses that he had been transported from the master down into the body of the slave-unit standing beneath it on the floor. As the control of its movements passed over to him, the slave started gradually to lean to one side, and he moved its foot to maintain balance as naturally as he moved his own. Tilting back his head, he could look up through the slave's eyes to see the master-unit, with himself inside, maintaining the same attitude on its complex suspension."
The term telepresence was coined in a 1980 article by Minsky, who outlined his vision for an adapted version of the older concept of teleoperation that focused on giving a remote participant a feeling of actually being present at a different location. One of the first systems to create a fully immersive illusion of presence in a remote location was the Virtual Fixtures platform developed in 1992 at the U.S. Air Force, Armstrong Labs by inventor Louis Rosenberg. The system included stereoscopic image display from the remote environment as well as immersive touch feedback using a full upper-body exoskeleton.
The first commercially successful telepresence company, Teleport (which was later renamed TeleSuite), was founded in 1993 by David Allen and Herold Williams. Before TeleSuite, they ran a resort business from which the original concept emerged, because they often found businesspeople would have to cut their stays short to participate in important meetings. Their idea was to develop a technology that would allow businesspeople to attend their meetings without leaving the resorts so that they could lengthen their hotel stays.
Hilton Hotels originally licensed the systems for installation in its hotels throughout the United States and other countries, but use was low. The idea lost momentum, with Hilton eventually backing out. TeleSuite then began to focus less on the hospitality industry and more on business-oriented telepresence systems. Shareholders eventually held enough stock to replace the company's original leadership, which ultimately led to its collapse. David Allen purchased all of the assets of TeleSuite and appointed Scott Allen as president of the new company, called Destiny Conferencing.
Destiny Conferencing licensed its patent portfolio to HP which became the first large company to join the telepresence industry, soon followed by others such as Cisco and Polycom. After forming a distribution agreement with Pleasanton-based Polycom, Destiny Conferencing sold on January 5, 2007 to Polycom for $60 million.
An important research project in telepresence began in 1990. Located at the University of Toronto, the Ontario Telepresence Project (OTP) was an interdisciplinary effort involving social sciences and engineering. Its final report stated that it "...was a three year, $4.8 million pre-competitive research project whose mandate was to design and field trial advanced media space systems in a variety of workplaces in order to gain insights into key sociological and engineering issues. The OTP, which ended in December 1994, was part of the International Telepresence Project which linked Ontario researchers to their counterparts in four European nations. The Project’s major sponsor was the Province of Ontario, through two of its Centres of Excellence—the Information Technology Research Centre (ITRC) and the Telecommunications Research Institute of Ontario (TRIO)." 
An industry expert described some benefits of telepresence: "There were four drivers for our decision to do more business over video and telepresence. We wanted to reduce our travel spend, reduce our carbon footprint and environmental impact, improve our employees' work/life balance, and improve employee productivity."
Rather than traveling great distances to hold a face-to-face meeting, it is now commonplace to use a telepresence system built around multiple video codecs (the sense in which the word "telepresence" is most commonly used today). Each party joins the meeting from a telepresence room, "dialing in" to see and talk to every other participant on one or more screens as if they were in the same room. This brings enormous time and cost benefits. It is also superior to phone conferencing (except in cost), as the visual aspect greatly enhances communication, allowing for the perception of facial expressions and other body language.
Mobile collaboration systems combine video, audio and on-screen drawing capabilities on the latest generation of hand-held mobile devices to enable multi-party conferencing in real time, independent of location. Benefits include cost efficiencies from accelerated problem resolution, reductions in downtime and travel, improvements in customer service, and increased productivity.
Telepresence has been described as the human experience of being fully present at a live real-world location remote from one's own physical location. Someone experiencing video telepresence would therefore be able to behave, and receive stimuli, as though part of a meeting at the remote site. This would allow interactive participation in group activities, bringing benefits to a wide range of users.
To provide a telepresence experience, technologies are required that implement the human sensory elements of vision, sound, and manipulation.
A minimum system usually includes visual feedback. Ideally, the entire field of view of the user is filled with a view of the remote location, and the viewpoint corresponds to the movement and orientation of the user's head. In this way, it differs from television or cinema, where the viewpoint is out of the control of the viewer.
To achieve this, the user may be provided with either a very large (or wraparound) screen, or small displays mounted directly in front of the eyes. The latter provides a particularly convincing 3D sensation. The movements of the user's head must be sensed, and the camera must mimic those movements accurately and in real time; any lag or mismatch between head motion and viewpoint can induce motion sickness.
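The control loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the function name, angles, and rate limit are assumptions, not a real device API): each cycle, the remote camera is slewed toward the user's sensed head angle, rate-limited so motion stays smooth despite network jitter.

```python
# Illustrative sketch (hypothetical interface): slave a remote pan axis
# to the user's sensed head yaw, limited to a maximum step per cycle.

def track_head(camera_deg, head_deg, max_step_deg=5.0):
    """Move the camera angle toward the head angle by at most
    max_step_deg per control cycle; returns the new camera angle."""
    error = head_deg - camera_deg
    step = max(-max_step_deg, min(max_step_deg, error))  # rate limit
    return camera_deg + step

# Simulate the user turning 30 degrees to the right over ten cycles.
camera = 0.0
for _ in range(10):
    camera = track_head(camera, 30.0)
print(camera)   # the camera has caught up: 30.0
```

In a real system the loop would run at the display's refresh rate, and the rate limit would be tuned against the camera hardware's slew capability.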
Sound is generally the easiest sensation to implement with high fidelity, based on the foundational telephone technology dating back more than 130 years. Very high-fidelity sound equipment has also been available for a considerable period of time, with stereophonic sound being more convincing than monaural sound.
The ability to manipulate a remote object or environment is an important aspect for some telepresence users, and can be implemented in a large number of ways depending on the needs of the user. Typically, the movements of the user's hands (position in space, and posture of the fingers) are sensed by wired gloves, inertial sensors, or absolute spatial position sensors. A robot in the remote location then copies those movements as closely as possible. This ability is also known as teleoperation.
The more closely the robot re-creates the form factor of the human hand, the greater the sense of telepresence. Complexity of robotic effectors varies greatly, from simple one axis grippers, to fully anthropomorphic robot hands.
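At its simplest, the teleoperation mapping described above is a function from a sensed hand reading to an actuator command. The sketch below is hypothetical (the names, the normalized flexion reading, and the 85 mm gripper range are illustrative assumptions): a glove's finger-flexion value drives a one-axis gripper.

```python
# Illustrative sketch (hypothetical names and ranges): map a normalized
# finger-flexion reading from a wired glove to a one-axis gripper command.

def flex_to_gripper(flex, opening_max_m=0.085):
    """flex: 0.0 = finger fully open, 1.0 = fully closed.
    Returns the commanded gripper opening in meters."""
    flex = max(0.0, min(1.0, flex))      # clamp noisy sensor readings
    return (1.0 - flex) * opening_max_m

print(flex_to_gripper(0.0))   # open hand -> fully open gripper: 0.085
print(flex_to_gripper(1.0))   # closed fist -> closed gripper: 0.0
```

A fully anthropomorphic hand would apply a mapping like this per joint, with per-finger calibration, rather than collapsing the whole hand to one axis.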
Haptic teleoperation refers to a system that provides some sort of tactile force feedback to the user, so the user feels some approximation of the weight, firmness, size, and/or texture of the remote objects manipulated by the robot.
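A common way to approximate the firmness of a remote object is the virtual-spring model: the force fed back to the user's hand grows in proportion to how far the remote probe presses into a surface (Hooke's law). This is a sketch, not a specific device's algorithm, and the stiffness value is a hypothetical example.

```python
# Illustrative sketch: virtual-spring (Hooke's law) haptic feedback.

def feedback_force(penetration_m, stiffness_n_per_m=500.0):
    """Return the reaction force in newtons sent back to the user's
    hand device, proportional to how far the remote probe has pressed
    into the surface; zero when there is no contact."""
    if penetration_m <= 0:
        return 0.0               # probe is in free space
    return stiffness_n_per_m * penetration_m

print(feedback_force(0.002))     # 2 mm penetration -> 1.0 N
print(feedback_force(-0.01))     # no contact -> 0.0 N
```

Higher stiffness makes surfaces feel harder, but real haptic loops must run at very high rates (commonly cited around 1 kHz) to keep stiff contacts stable.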
The prevalence of high-quality video conferencing on mobile devices, tablets and portable computers has enabled drastic growth in telepresence robots, which help give a better sense of remote physical presence for communication and collaboration in the office, home or school when one cannot be there in person. The robot avatar can move or look around at the command of the remote person. Drivable telepresence robots typically consist of a display (integrated, or a separate phone or tablet) mounted on a roaming base. Some examples of roaming telepresence robots include Beam by Suitable Technologies, Double by Double Robotics, RP-Vita by iRobot, Anybots, Vgo, TeleMe by Mantarobot, and Romo by Romotive.
More modern roaming telepresence robots may be able to operate autonomously: they can map out a space and avoid obstacles while driving themselves between rooms and their docking stations.
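The navigation step described above (reaching the dock while avoiding mapped obstacles) can be illustrated with a breadth-first search over an occupancy grid. This is a minimal sketch under simplifying assumptions: a 2D grid map, four-way movement, and no robot footprint or kinematics.

```python
# Illustrative sketch: shortest route to the docking station on an
# occupancy grid (0 = free cell, 1 = obstacle), via breadth-first search.
from collections import deque

def plan_path(grid, start, dock):
    """Return the shortest list of cells from start to dock, or None
    if the dock is unreachable."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == dock:                     # reconstruct the route
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = step
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and step not in parent):
                parent[step] = cell
                queue.append(step)
    return None                              # dock unreachable

# A wall across the middle row forces the robot to detour around it.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
route = plan_path(grid, start=(0, 0), dock=(2, 0))
print(route)
```

Production robots use richer maps (e.g. SLAM-built occupancy grids) and smoother planners, but the idea of searching a free-space map for a route to the dock is the same.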
Telepresence's effectiveness varies by degree of fidelity. Research has noted that telepresence solutions differ in degree of implementation, from "immersive" through "adaptive" to "lite" solutions. At the top are immersive solutions where the environments at both ends are highly controlled (and often the same) with respect to lighting, acoustics, decor and furniture, thereby giving all the participants the impression they are together at the same table in the same room, thus engendering the "immersive" label.
Adaptive telepresence solutions may use the same technology, but the environments at both ends are not highly controlled and hence often differ. Adaptive solutions differ from telepresence lite solutions not in terms of control of environments, but in terms of integration of technology. Adaptive solutions use a managed service, whereas telepresence lite solutions use components that someone must integrate.
A good telepresence strategy puts human factors first, focusing on visual collaboration configurations that closely replicate the brain's innate preferences for interpersonal communication, in contrast to the unnatural "talking heads" experience of traditional videoconferencing. These cues include life-size participants, fluid motion, accurate flesh tones and the appearance of true eye contact. Telepresence is already a well-established technology, used by many businesses today. At the Networkers Conference in June 2006, Cisco Systems chief executive officer John Chambers compared telepresence to the teleporting of Star Trek, and said that he saw the technology as a potential billion-dollar market for Cisco.
Rarely will a telepresence system provide such a transparent implementation with such comprehensive and convincing stimuli that the user perceives no differences from actual presence. But the user may set aside such differences, depending on the application.
The fairly simple telephone achieves a limited form of telepresence using just the human sensory element of hearing, in that users consider themselves to be talking to each other rather than talking to the telephone itself.
Watching television, for example, although it stimulates our primary senses of vision and hearing, rarely gives the impression that the watcher is no longer at home. However, television sometimes engages the senses sufficiently to trigger emotional responses from viewers somewhat like those experienced by people who directly witness or experience events. Televised depictions of sports events, or disasters such as the September 11 terrorist attacks, can elicit strong emotions from viewers.
As the screen size increases, so does the sense of immersion, as well as the range of subjective mental experiences available to viewers. Some viewers have reported a sensation of genuine vertigo or motion sickness while watching IMAX movies of flying or outdoor sequences.
Because most currently feasible telepresence gear leaves something to be desired, the user must suspend disbelief to some degree and choose to act in a natural way appropriate to the remote location, perhaps using some skill to operate the equipment. In contrast, a telephone user does not see herself as "operating" the telephone, but merely talking to another person with it.
Telepresence refers to a user interacting with another live, real place, and is distinct from virtual presence, where the user is given the impression of being in a simulated environment. Telepresence and virtual presence rely on similar user-interface equipment, and they share the common feature that the relevant portions of the user's experience at some point in the process will be transmitted in an abstract (usually digital) representation. The main functional difference is the entity on the other end: a real environment in the case of telepresence, vs. a computer in the case of immersive virtual reality.
The Cooperative web or Co-Web refers to a browser-based platform that promises to replicate the power of face-to-face communications via web-touch without sacrificing the quality of human interactions, using the human sensory elements of vision, sound and manipulation.
Application examples can be cited in emergency management and security services, B&I, and the entertainment and education industries.
Telepresence can be used to establish a sense of shared presence or shared space among geographically separated members of a group.
Many other applications in situations where humans are exposed to hazardous situations are readily recognised as suitable candidates for telepresence. Mining, bomb disposal, military operations, rescue of victims from fire, toxic atmospheres, deep sea exploration, or even hostage situations, are some examples. Telepresence also plays a critical role in the exploration of other worlds, such as with the Mars Exploration Rovers, which are teleoperated from Earth.
Small diameter pipes otherwise inaccessible for examination can now be viewed using pipeline video inspection.
The possibility of being able to project the knowledge and the physical skill of a surgeon over long distances has many attractions. Thus, again there is considerable research underway in the subject. (Locally controlled robots are currently being used for joint replacement surgery as they are more precise in milling bone to receive the joints.) The armed forces have an obvious interest since the combination of telepresence, teleoperation, and telerobotics can potentially save the lives of battle casualties by allowing them prompt attention in mobile operating theatres by remote surgeons.
Recently, teleconferencing has been used in medicine (telemedicine or telematics), mainly employing audio-visual exchange, for the performance of real time remote surgical operations – as demonstrated in Regensburg, Germany in 2002. In addition to audio-visual data, the transfer of haptic (tactile) information has also been demonstrated in telemedicine.
Research has been conducted on the use of telepresence to provide professional development to teachers. Research has shown that one of the most effective forms of teacher professional development is coaching, or cognitive apprenticeship. The application of telepresence shows promise for making this approach to teacher professional development practical.
The benefits of enabling schoolchildren to take an active part in exploration have also been shown by the JASON and NASA Ames Research Center programs. The ability of a pupil, student, or researcher to explore an otherwise inaccessible location is a very attractive proposition: for example, locations where the passage of too many people harms the immediate environment or the artifacts themselves, such as undersea coral reefs, ancient Egyptian tombs, and more recent works of art.
Another application is the remote classroom, which allows a professor to teach the same class simultaneously to students on multiple campuses. An example of this application is in the law school classrooms of Rutgers University. Two identical rooms are located in two metropolitan areas, each equipped with studio lighting, audio, and videoconference equipment connected to a 200-inch monitor on the wall that students face, giving the impression that they are all in the same classroom. This allows professors to be on either campus and facilitates interaction among students on both campuses during class.
In 1998, Diller and Scofidio created "Refresh", an Internet-based art installation that juxtaposed a live web camera with recorded videos staged by professional actors. Each image was accompanied by a fictional narrative, which made it difficult to distinguish which was the live web camera.
In 1993, Eduardo Kac and Ed Bennett created the telepresence installation "Ornitorrinco on the Moon" for the international telecommunication arts festival "Blurred Boundaries" (Entgrenzte Grenzen II). It was coordinated by Kulturdata in Graz, Austria, and connected participants around the world.
Since 1997, Ghislaine Boddington of shinkansen and body>data>space has explored the extended use of telepresence in festivals, arts centres and clubs through a collaborative process, using performing-arts techniques, that she has called The Weave, and has directed numerous workshops leading many artists worldwide to explore telepresence. This methodology has been used extensively to develop tele-intuition skills in young people in preparation for the future world of work, through "Robots and Avatars", a body>data>space / NESTA project exploring how young people will work and play with new representational forms of themselves and others in virtual and physical life over the next 10–15 years.
An overview of telepresence in dance and theatre over the last 20 years is given in Excited Atoms, a research document by Judith Staines (2009), which can be downloaded from the On The Move website.
Marvin Minsky was one of the pioneers of intelligence-based mechanical robotics and telepresence. He designed and built some of the first mechanical hands with tactile sensors, visual scanners, and their software and computer interfaces. He also influenced many robotic projects outside of MIT, and designed and built the first LOGO "turtle."
Telepresence is represented in media and entertainment.