Commonly, surface and solid haptic effects are defined in such a way that they can hardly be rendered together. We propose a method for defining mixed haptic effects that include surfaces, solids, and force fields. These haptic effects can be applied to virtual scenes containing various objects, including polygon meshes, point clouds, impostors, layered textures, voxel models, and function-based shapes. Accordingly, we propose a way to identify the location of the haptic tool in such virtual scenes, as well as to consistently and seamlessly determine haptic effects when the haptic tool moves through scenes whose objects have different sizes, locations, and mutual penetrations. To provide efficient and flexible rendering of haptic effects, we propose to concurrently use explicit, implicit, and parametric functions, as well as algorithmic procedures.
Field of Research
080111 Virtual Reality and Related Simulation
Socio Economic Objective
970108 Expanding Knowledge in the Information and Computing Sciences