ProximitySensor Example
Many people have asked how to have a shape, or group of shapes, keep
its position relative to the user as the user moves around the world.
This example shows you how to do that.
ProximitySensor nodes can be used to keep track of the user's position
and orientation. Two eventOuts are provided for this purpose:
- position_changed
- orientation_changed
A ProximitySensor node therefore generates events whenever the user
changes position or orientation. These events can then be routed to the
Transform node where the shapes are placed.
The only problem with this method is that a ProximitySensor requires
the size of a virtual box to be defined. If the user is outside this
box, the ProximitySensor will NOT generate events. To avoid this
problem, simply define the size of the ProximitySensor to be larger
than the world itself.
The following code should do the trick:
#VRML V2.0 utf8
Group {
  children [
    DEF ps ProximitySensor {
      center 0 0 0
      size 1000 1000 1000
    }
    DEF tr Transform {
      children Transform {
        translation 0 0 -5
        children Shape { geometry Sphere {} }
      }
    }
  ]
}
ROUTE ps.position_changed TO tr.set_translation
ROUTE ps.orientation_changed TO tr.set_rotation
Note that the sphere, which is 'locked' to the user's position, is inside
a Transform node that contains a translation. This translation defines the
position relative to the user; in this case the center of the sphere will
be 5 units in front of the user.
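By changing this inner translation you can hold any shape at any offset
from the viewer. The following variant is only a sketch, not part of the
original example: it keeps a small yellow text label half a unit below and
2 units in front of the viewpoint. The node names PROX and HUD, the offset
values, and the label text are all illustrative assumptions.
#VRML V2.0 utf8
Group {
  children [
    # Sensor sized to cover the whole world, as before
    DEF PROX ProximitySensor { size 1000 1000 1000 }
    # Outer Transform follows the user; inner Transform holds the offset
    DEF HUD Transform {
      children Transform {
        translation 0 -0.5 -2    # half a unit below, 2 units in front
        children Shape {
          appearance Appearance {
            material Material { diffuseColor 1 1 0 }
          }
          geometry Text { string [ "Hello" ] }
        }
      }
    }
  ]
}
ROUTE PROX.position_changed TO HUD.set_translation
ROUTE PROX.orientation_changed TO HUD.set_rotation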
Press the button below to see the VRML. Note that there is also a box
in the world. The box should remain in place while you move; only the
sphere will follow your movement.
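The box is not shown in the listing above; presumably it is simply a
sibling of the routed Transform, so it never receives the position and
orientation events. A minimal sketch of such a scene follows (the box and
its placement at 3 0 0 are assumptions, not taken from the original
listing):
#VRML V2.0 utf8
Group {
  children [
    DEF ps ProximitySensor { size 1000 1000 1000 }
    # This Transform receives the sensor events, so the sphere follows the user
    DEF tr Transform {
      children Transform {
        translation 0 0 -5
        children Shape { geometry Sphere {} }
      }
    }
    # This Transform is NOT routed to, so the box stays where it is
    Transform {
      translation 3 0 0
      children Shape { geometry Box {} }
    }
  ]
}
ROUTE ps.position_changed TO tr.set_translation
ROUTE ps.orientation_changed TO tr.set_rotation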