Title:
WEARABLE DEVICE FOR THE USE OF AUGMENTED REALITY
Document Type and Number:
WIPO Patent Application WO/2021/148993
Kind Code:
A2
Abstract:
A system (10) for the fruition of augmented reality by a user comprising a wearable device (100) arranged to project a virtual image overlapped to an image of a real scenario comprising a plurality of characteristic points pij, said wearable device (100) comprising a frame (110) arranged to allow the user to wear the wearable device (100), said frame (110) defining a centre of projection Cp. The system (10) also comprises at least one see-through display (120), at least partially transparent, comprising a micro display (121) arranged to emit the virtual image, an ocular lens (122) having a focal distance ƒ1, an optical combiner (123) arranged to project the virtual image emitted by the micro display (121) in front of at least one eye of the user, said virtual image being focused by the user in a plane Πv at a distance V with respect to the centre of projection Cp. The system (10) then comprises a tracking device arranged to carry out a localization of the characteristic points pij and of the wearable device (100), said characteristic points pij arranged on characteristic planes Πi located at distances di with respect to the centre of projection Cp. The system also comprises a control unit arranged to receive the localization of the characteristic points pij and of the wearable device (100) and to operate said or each see-through display (120) for consistently overlapping the virtual image to the image of the real scenario.

Inventors:
FERRARI VINCENZO (IT)
CUTOLO FABRIZIO (IT)
CATTARI NADIA (IT)
FONTANA UMBERTO (IT)
Application Number:
PCT/IB2021/050483
Publication Date:
July 29, 2021
Filing Date:
January 22, 2021
Assignee:
UNIV PISA (IT)
International Classes:
G02B27/00; G02B27/01; G06T19/00
Attorney, Agent or Firm:
CELESTINO, Marco (IT)
Claims:
CLAIMS

1. A system (10) for the fruition of augmented reality by a user comprising:

— a wearable device (100) arranged to project a virtual image overlapped to an image of a real scenario comprising a plurality of characteristic points pij, said wearable device (100) comprising:

— a frame (110) arranged to allow said user to wear said wearable device (100), said frame (110) defining a centre of projection Cp as ideal positioning of the nodal point of the eye of said user for having a consistent overlapping between said virtual image and said image of the real scenario, said centre of projection Cp having a known position with respect to a reference system S integral to said frame (110);

— at least one see-through display (120), at least partially transparent, comprising:

— a micro display (121) arranged to emit said virtual image;

— an ocular lens (122) having a focal distance ƒ1;

— an optical combiner (123) arranged to project said virtual image emitted by said micro display (121) in front of at least one eye of said user, said virtual image being focused by said user in a plane Πv at a distance V with respect to said centre of projection Cp;

— a tracking device arranged to carry out a localization of said characteristic points pij and of said wearable device (100) with respect to said reference system S, said characteristic points pij arranged on characteristic planes Πi located at distances di with respect to said centre of projection Cp;

— a control unit arranged to receive, from said tracking device, said localization of said characteristic points pij and of said wearable device (100) with respect to said reference system S and to operate said or each see-through display (120) for consistently overlapping said virtual image to said image of said real scenario at said centre of projection Cp; said system (10) characterized in that said wearable device (100) also comprises at least one positive lens

(130) having a scale factor A and a focal distance ƒ2 and arranged, on the opposite side with respect to said eyes of said user, in a plane ΠR at a distance R with respect to said centre of projection Cp, each characteristic point pij located in a characteristic plane Πi at a distance di with respect to said centre of projection Cp being focused on said plane Πv together with said virtual image whatever the position of the eye of said user with respect to the centre of projection Cp.

2. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said ocular lens (122) is arranged substantially at a distance equal to said focal distance ƒ1 with respect to said micro display (121), in such a way that said user focuses said virtual image at infinity and that said characteristic points pij of said real scenario located at a distance equal to said focal distance ƒ2 with respect to said positive lens (130) are also focused at infinity with respect to said user.

3. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said tracking device comprises at least one external camera (115).

4. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said tracking device carries out said localization of said characteristic points pij by means of a technology selected from the group consisting of:

— optical tracking by monoscopic video camera;

— optical tracking by multiscopic video camera;

— optical tracking by mono- or multiscopic video camera with projection of light patterns;

— electromagnetic tracking;

— inertial tracking;

— time-of-flight tracking;

— a combination of the previous.

5. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said positive lens (130) has a suitably variable focal distance ƒ2.

6. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said positive lens (130) is a liquid lens based on glycerol.

7. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein an adjustment means is provided, arranged to change the relative position between said positive lens (130) and said centre of projection Cp.

8. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said control unit is adapted to carry out a preliminary calibration comprising a step of compensating the scale factor A.

9. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said control unit is adapted to define a maximum error of overlapping said virtual image to said image of said real scenario for different characteristic points pij.

10. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein said control unit is adapted to carry out a manual calibration step that, by means of said tracking device, allows the position of an eye of said user to be identified with respect to said centre of projection Cp.

11. The system (10) for the fruition of augmented reality by a user, according to claim 1, wherein an eye-tracking system is also provided, arranged to identify the movements of the eye of said user with respect to said frame (110).

12. A method for calibrating a system (10) for the fruition of augmented reality by a user, said system (10) comprising a wearable device (100) arranged to project a virtual image overlapped to an image of a real scenario comprising a plurality of characteristic points pij, said wearable device (100) comprising:

— a frame (110) arranged to allow said user to wear said wearable device (100);

— at least one see-through display (120), at least partially transparent, comprising:

— a micro display (121) arranged to emit said virtual image;

— an ocular lens (122) having a focal distance ƒ1;

— an optical combiner (123) arranged to project said virtual image emitted by said micro display (121) in front of at least one eye of said user; said method arranged to determine the position of a centre of projection Cp with respect to a reference system S integral to said frame (110), said centre of projection Cp defined as ideal positioning of the nodal point of the eye of said user for having a consistent overlapping between said virtual image and said image of the real scenario, said virtual image being focused by said user in a plane Πv at a distance V with respect to said centre of projection Cp, said method comprising the steps of:

— arranging a calibration camera, in order to have a centre of projection Cc known with respect to said reference system S integral to said frame (110), said calibration camera being pointed towards said see-through display (120);

— projecting, by said see-through display (120), an image comprising a plurality of characteristic calibration points;

— acquiring, by said calibration camera, said image projected by said see-through display (120) and identifying in said acquired image said characteristic calibration points;

— comparing said image acquired by said calibration camera with said image projected by said see-through display (120) for determining a transformation matrix which transforms said plurality of characteristic points of said projected image into said plurality of characteristic points identified in said acquired image;

— starting from said transformation matrix, computing a three-dimensional translation vector arranged to define a spatial distance between said centre of projection Cc and said centre of projection Cp, with subsequent spatial identification of said centre of projection Cp with respect to said reference system S.

13. A system (10) for the fruition of augmented reality by a user comprising:

— a wearable device (100) arranged to project a virtual image overlapped to an image of a real scenario comprising a plurality of characteristic points pij, said wearable device (100) comprising:

— a frame (110) arranged to allow said user to wear said wearable device (100), said frame (110) defining a centre of projection Cp as ideal positioning of the nodal point of the eye of said user for having a consistent overlapping between said virtual image and said image of the real scenario, said centre of projection Cp having a known position with respect to a reference system S integral to said frame (110);

— at least one see-through display (120), at least partially transparent, comprising:

— a micro display (121) arranged to emit said virtual image;

— an ocular lens (122) having a focal distance ƒ1;

— an optical combiner (123) arranged to project said virtual image emitted by said micro display (121) in front of at least one eye of said user, said virtual image being focused by said user in a plane Πv at a distance V with respect to said centre of projection Cp;

— a tracking device arranged to carry out a localization of said characteristic points pij and of said wearable device (100) with respect to said reference system S, said characteristic points pij arranged on characteristic planes Πi located at distances di with respect to said centre of projection Cp;

— a control unit arranged to receive, from said tracking device, said localization of said characteristic points pij and of said wearable device (100) with respect to said reference system S and to operate said or each see-through display (120) for consistently overlapping said virtual image to said image of said real scenario at said centre of projection Cp; said system (10) characterized in that said control unit is adapted to:

— receive a value of a desired working distance, i.e. of a distance from the centre of projection Cp of a characteristic plane on which lie the characteristic points pij with which the virtual image is to be made consistent;

— modify said distance V of said plane Πv with respect to said centre of projection Cp in such a way that V equals said desired working distance.

Description:
TITLE

WEARABLE DEVICE FOR THE USE OF AUGMENTED REALITY

DESCRIPTION

Field of the invention

The present invention relates to the field of augmented reality.

In particular, the invention relates to a wearable "optical see-through" viewer that allows an improved overlap of virtual content on a real scenario.

Description of the prior art

As is well known, optical see-through (OST) wearable systems allow the user to observe the world with their own eyes through a semi-transparent display on which virtual images are reproduced. In particular, the view is augmented by reproducing the virtual content on a two-dimensional micro display and projecting it, by means of a beam combiner, for example an appropriate optical guide, onto a semi-transparent projection surface at a comfortable sight distance (Rolland and Fuchs 2000, Benton 2001).

A limitation of these viewers is given by the intrinsic difficulty of providing perfect alignment between the real-world view and the virtual images projected on the semi-transparent display. In particular, these systems are very sensitive to changes in focus by the user's eye and to changes in the relative position between the eye and the viewer.

The first aspect implies that these systems are effective, to date, only if the virtual contents are simplified graphics, indications and/or text, which require neither a precise location in the real environment nor a focus consistent with it.

The second aspect instead involves the so-called parallax error: when the relative position between the eye and the viewer varies, the virtual image translates differently from the real image.
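
To make the magnitude of this error concrete: for an eye displacement Δ, a real point at distance d and a virtual image plane at distance V, small-angle geometry gives a misalignment of about Δ·|1/d − 1/V| radians. The following minimal sketch evaluates it with illustrative numbers only, not values taken from the patent:

import math

def parallax_error_deg(eye_shift_m, real_dist_m, virtual_dist_m):
    # Angular misalignment (degrees) between a real point at distance
    # real_dist_m and its virtual counterpart rendered on a plane at
    # virtual_dist_m, when the eye shifts laterally by eye_shift_m from
    # the nominal viewpoint (small-angle approximation).
    return math.degrees(eye_shift_m * abs(1.0 / real_dist_m - 1.0 / virtual_dist_m))

# Illustrative numbers: a 5 mm eye shift, real object at 0.5 m and
# virtual image plane at 2 m give roughly 0.43 degrees of misalignment.
print(parallax_error_deg(0.005, 0.5, 2.0))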

A first solution consists in tracking the eye ("eye-tracking") through the use of a camera pointed towards it. However, these systems usually require complex calibrations and adaptations to the user's ocular geometry, making this solution impractical for non-expert users.

Document US7809160B2 tries to solve this problem, proposing an apparatus and a method for detecting the gaze of the eyes that do not require camera calibration or specific measurements of the user's face geometry, thanks to a system that automatically detects the distance between the eyes.

However, even this system, although more effective, requires the presence of a dedicated camera inside the viewer, with a consequent increase in the cost of production and in the complexity of the device.

Furthermore, the eye-tracking system allows the centre of rotation of the eye to be determined, which however does not coincide with the centre of projection, resulting in only a partial resolution of the aforementioned parallax error.

Furthermore, the correct overlap of the virtual image on the image of the real scenario is today obtained through a manual calibration, or through complex instrumentation and methods, of the projective parameters of a rendering algorithm, carried out with respect to a specific position of the eye; these parameters are then updated on the basis of the displacement of the eye with respect to said position.

Summary of the invention

It is therefore an object of the present invention to provide an "optical see-through" wearable device for the use of augmented reality that allows virtual contents to be overlaid while maintaining consistency with respect to the real-world view, without being affected by changes in the position and focus of the user's eye.

It is also an object of the present invention to provide such a device that does not require the presence of eye tracking devices or device calibrations to adapt it to the geometry of the user's face and to the relative pose between the face and the device.

These and other objects are achieved by a system for the fruition of augmented reality according to claims 1 to 11 and 13.

According to another aspect of the invention, a method for calibrating a system for the fruition of augmented reality according to claim 12 is also claimed.

Brief description of the drawings

Further characteristics and/or advantages of the present invention are made clearer by the following description of an exemplary embodiment thereof, exemplifying but not limiting, with reference to the attached drawings, in which:

— Fig. 1 shows a system for the fruition of augmented reality according to the present invention;

— Fig. 2 schematically shows the operating principle of the wearable device.

Description of a preferred exemplary embodiment

With reference to Fig. 1, the system 10 for the fruition of augmented reality by a user comprises a wearable device 100 comprising a frame 110 and at least one see-through display 120.

In particular, with reference also to Fig. 2, the wearable device 100 is adapted to project a virtual image overlapped to an image of a real scenario comprising a plurality of characteristic points pij.

The frame 110 allows the user to wear the wearable device 100 and defines a centre of projection Cp as the ideal position of the nodal point of the user's eye for a consistent overlap between the virtual image and the image of the real scenario.

The purpose of the present invention is to allow the correct overlap between the virtual image and the image of the real scenario even when, due to movements of the frame with respect to the user's eye, the nodal point of the eye is not perfectly superimposed on the centre of projection Cp.

The see-through display 120, at least partially transparent, comprises a micro display 121 arranged to emit the virtual image, an ocular lens 122 having a focal distance ƒ1, and an optical combiner 123 arranged to project the virtual image emitted by the micro display 121 in front of the eye of the user. In particular, the virtual image is focused by the user in a plane Πv at a distance V with respect to the centre of projection Cp.

The system 10 then comprises a tracking device and a control unit, not shown in the figure.

In particular, the tracking device carries out a localization of the characteristic points pij and of the wearable device 100 with respect to the reference system S, the characteristic points pij being arranged on characteristic planes Πi located at distances di with respect to the centre of projection Cp. The control unit is instead arranged to receive, from the tracking device, the localization of the characteristic points pij and of the wearable device 100 and to operate the see-through display 120 for consistently overlapping the virtual image to the image of the real scenario at the centre of projection Cp.
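
As a rough sketch of what "consistently overlapping at the centre of projection" means computationally, the control unit can intersect the ray from Cp through each tracked point with the virtual-image plane Πv. The simple pinhole model below is an assumption for illustration, not the patent's actual rendering pipeline; the coordinate convention (viewing axis along +z from Cp) is hypothetical.

import numpy as np

def render_on_virtual_plane(p_real_S, C_p_S, V):
    # Given a tracked real point and the centre of projection C_p, both
    # expressed in the frame's reference system S (viewing axis along +z
    # from C_p, an assumed convention), return the point on the virtual
    # plane Pi_v at distance V where the virtual content must be drawn
    # so that it lies on the ray from C_p through the real point.
    ray = p_real_S - C_p_S
    if ray[2] <= 0:
        raise ValueError("point must lie in front of the centre of projection")
    t = V / ray[2]              # scale the ray until its depth equals V
    return C_p_S + t * ray      # intersection with the plane z = V

# Hypothetical values: real point 60 cm ahead, virtual plane at 2 m.
print(render_on_virtual_plane(np.array([0.05, -0.02, 0.6]), np.zeros(3), V=2.0))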

In particular, the wearable device 100 also comprises at least one positive lens 130 having a scale factor A and a focal distance ƒ2. Such positive lens 130 is arranged, on the opposite side with respect to the eyes of the user, in a plane ΠR at a distance R with respect to the centre of projection Cp.

This way, each characteristic point pij located in a characteristic plane Πi at a distance di with respect to the centre of projection Cp is focused in the plane Πv together with the virtual image, whatever the position of the eye of the user with respect to the centre of projection Cp.

This way, once the distance V is defined, possibly also as an infinite value, and the desired working distance is defined, i.e. the distance from the centre of projection Cp of the characteristic plane on which lie the characteristic points pij of the objects with which the virtual image is to be made consistent, it is possible to define the focal distance ƒ2 in such a way that the characteristic points at the working distance are focused on the plane Πv together with the virtual image, obtaining coherence between the virtual image and the objects at the working distance even if the nodal point of the eye is displaced from the centre of projection Cp.
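
Under a thin-lens model, in which the positive lens 130 at distance R from Cp must image the working-distance plane onto the virtual-image plane Πv, the required focal distance follows from the thin-lens equation. The sketch below relies on that assumed model; the symbol d_bar for the working distance is introduced here only for illustration.

def required_f2(d_bar, V, R):
    # Focal distance f2 of the positive lens, placed at distance R from
    # the centre of projection C_p, so that a real plane at the working
    # distance d_bar is seen focused on the virtual-image plane Pi_v at
    # distance V (thin-lens equation, all distances measured from C_p,
    # in metres).  V = float('inf') gives the collimated case of claim 2,
    # where f2 = d_bar - R.
    s_o = d_bar - R                 # object distance from the lens
    if V == float('inf'):
        return s_o
    s_i = V - R                     # virtual-image distance from the lens
    return 1.0 / (1.0 / s_o - 1.0 / s_i)

# Hypothetical numbers: lens 3 cm from C_p, working plane at 0.5 m,
# virtual plane at 2 m -> f2 of about 0.62 m; collimated case -> 0.47 m.
print(required_f2(0.5, 2.0, 0.03))
print(required_f2(0.5, float('inf'), 0.03))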

Vice versa, having defined the distance V and fixed the focal distance ƒ2, the virtual image is coherent with respect to the characteristic points pij arranged in the characteristic plane located at the corresponding working distance with respect to the centre of projection Cp, even if the nodal point of the eye is displaced from the centre of projection Cp.

In the first case, it is possible to suitably vary the focal distance ƒ2 to obtain a coherent virtual image at the desired working distance. This can be done by choosing a positive lens 130 with a suitable focal distance ƒ2, or by using a lens 130 having a variable focal distance ƒ2, such as a liquid glycerol-based lens.

In particular, the ocular lens 122 can be arranged substantially at a distance equal to the focal distance ƒ1 with respect to the micro display 121, in such a way that the user focuses the virtual image at infinity and that the characteristic points pij of the real scenario located at a distance equal to the focal distance ƒ2 with respect to the positive lens 130 are also focused at infinity with respect to the user.
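
In compact form, under the same assumed thin-lens model (with the symbol \bar{d} denoting the working distance, a notation introduced here), this collimated configuration reads:

\[
  s_{\mathrm{display}} = f_1 \;\Rightarrow\; V \to \infty ,
  \qquad
  \bar{d} - R = f_2 \;\Rightarrow\; \text{real points collimated},
\]
\[
  \frac{1}{f_2} \;=\; \frac{1}{\bar{d} - R} \;-\; \frac{1}{V - R}
  \;\xrightarrow{\;V \to \infty\;}\; f_2 \;=\; \bar{d} - R .
\]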

In an alternative embodiment of the invention, the wearable device 100 does not comprise the positive lens 130 and the control unit is arranged to:

— receive a value of a desired working distance, i.e. of a distance from the centre of projection Cp of a characteristic plane on which lie the characteristic points pij with which the virtual image is to be made consistent;

— modify the distance V of the plane Πv with respect to the centre of projection Cp in such a way that V equals the desired working distance, as in the sketch below.
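
A minimal sketch of this alternative behaviour, with an entirely hypothetical actuator interface (the patent specifies no API): without the positive lens, the control unit simply drives the virtual-image plane so that V matches the requested working distance.

class FocusActuator:
    # Stand-in for a varifocal optical element (hypothetical interface).
    def set_virtual_image_distance(self, V_m):
        print(f"virtual-image plane moved to {V_m:.2f} m")

class VirtualPlaneController:
    # Control unit of the alternative embodiment: no positive lens, the
    # consistency condition is enforced by setting V = working distance.
    def __init__(self, focus_actuator):
        self.focus_actuator = focus_actuator

    def set_working_distance(self, d_work_m):
        V = d_work_m                         # enforce V = working distance
        self.focus_actuator.set_virtual_image_distance(V)
        return V

# Usage: track a plane at 0.8 m and move the virtual image onto it.
VirtualPlaneController(FocusActuator()).set_working_distance(0.8)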

The invention also provides a calibration step of the system 10 for the fruition of augmented reality, arranged to identify the position of the centre of projection Cp with respect to the reference system S.

In particular, the method comprises the steps of:

— arranging a calibration camera, in order to have a centre of projection Cc known with respect to the reference system S integral to the frame 110, said calibration camera being pointed towards the see-through display 120;

— projecting, by the see-through display 120, an image comprising a plurality of characteristic calibration points;

— acquiring, by means of the calibration camera, the image projected by the see-through display 120 and identifying in the acquired image the characteristic calibration points;

— comparing the image acquired by the calibration camera with the image projected by the display 120 for determining a transformation matrix which transforms the plurality of characteristic points of the projected image into the plurality of characteristic points identified in the acquired image;

— starting from the transformation matrix, computing a three-dimensional translation vector arranged to define a spatial distance between the centre of projection Cc and the centre of projection Cp, with subsequent spatial identification of the centre of projection Cp with respect to the reference system S.
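
A minimal sketch of this comparison step, assuming a planar pattern of calibration points and OpenCV's homography routines; the point values and the camera intrinsic matrix below are placeholders, and the decomposition (which yields the Cc-to-Cp translation only up to scale) follows the standard cv2 API rather than the patent's exact computation.

import cv2
import numpy as np

# Calibration points as projected by the display (pixel coordinates) and
# as identified in the image acquired by the calibration camera; the
# values and the intrinsic matrix K are placeholders.
projected = np.array([[100, 100], [540, 100], [540, 380], [100, 380]], dtype=np.float64)
acquired = np.array([[112, 95], [548, 108], [530, 390], [95, 372]], dtype=np.float64)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

# Transformation matrix mapping the projected points onto the acquired ones.
H, _ = cv2.findHomography(projected, acquired, method=0)

# Decompose the homography into candidate rotations and translations: the
# translation encodes the change of observation point between the centre
# of projection C_c of the camera and the centre of projection C_p.
_, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
for t in translations:
    print(t.ravel())   # candidate C_c -> C_p translations (up to scale)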

In substance, the comparison between the image as projected by the display 120 and as acquired by the calibration camera, placed in a different position with respect to the projector of the display 120, makes it possible to identify the change of observation point between the two images, to calculate the centre of projection Cp, and to consistently overlap the virtual image to the image of the real scenario at a distance d = V, even if the nodal point of the eye is displaced from the centre of projection Cp.

In the prior art such calibration is carried out manually, involving the presence of an expert operator who carries out an appropriate calibration on each device and for each user. The present invention instead allows a totally automatic calibration by the system.

The foregoing description of some specific exemplary embodiments will so fully reveal the invention from the conceptual point of view that others, by applying current knowledge, will be able to modify and/or adapt the specific exemplary embodiments to various applications without further research and without parting from the invention; accordingly, it is meant that such adaptations and modifications will have to be considered as equivalent to the specific embodiments. The means and the materials to realise the different functions described herein could have a different nature without, for this reason, departing from the field of the invention. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation.