I conduct interdisciplinary research in the areas of man-machine interfaces (in particular haptics), control systems, mechatronics, robotics, biomechanics, physics-based modeling and simulation, 3D computer graphics, and virtual reality technology (see Google Scholar). Before joining Koc University, I was a senior member of technical staff at the NASA Jet Propulsion Laboratory, California Institute of Technology, from 1999 to 2002. I moved to JPL from the Massachusetts Institute of Technology, where I was a research scientist and principal investigator at the MIT Research Laboratory of Electronics and a member of the MIT Touch Lab from 1996 to 1999. I received my Ph.D. degree from Southern Methodist University in 1994 and worked as a research scientist at Northwestern University Research Park (with MusculoGraphics Inc.) for two years before moving to Boston. Phone: +90 212 338 1721 Address: College of Engineering, Koc University, Istanbul 34450, Turkey
Haptics provides a natural and intuitive channel of communication during the interaction of two humans in complex physical tasks, such as joint object transportation. However, despite the utmost importance of touch in physical interactions, the use of haptics is underrepresented when developing intelligent systems. This study explores the prominence of haptic data to extract information about underlying interaction patterns within human-human cooperation. For this purpose, we design salient haptic features describing the collaboration quality within a physical dyadic task and investigate the use of these features to classify the interaction patterns. We categorize the interaction into four discrete behavior classes. These classes describe whether the partners work in harmony or face conflicts while jointly transporting an object through translational or rotational movements. We test the proposed features on a physical human-human interaction (pHHI) dataset consisting of data collected from 12 human dyads. Using these data, we verify the salience of the haptic features by achieving a correct classification rate over 91% using a Random Forest classifier.
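The kind of haptic features described above can be illustrated with a short sketch. The feature definitions and the harmony/conflict rule below are hypothetical simplifications for illustration only; the paper's actual features, four-class taxonomy, and Random Forest classifier are not reproduced here.

```python
import numpy as np

def interaction_features(f1, f2, v):
    """Illustrative dyadic-interaction features (assumed, not the paper's).

    f1, f2: (T, 2) forces applied by each partner; v: (T, 2) object velocity.
    """
    f_net = f1 + f2                          # net force moving the object
    f_int = 0.5 * (f1 - f2)                  # internal (conflict) force component
    agreement = np.sum(f1 * f2, axis=1)      # > 0 when partners push together
    power = np.sum(f_net * v, axis=1)        # net power injected into the object
    return {
        "mean_agreement": float(agreement.mean()),
        "mean_internal_force": float(np.linalg.norm(f_int, axis=1).mean()),
        "mean_power": float(power.mean()),
    }

def label_window(feats, eps=0.0):
    # Toy rule: call the window harmonious if the partners' forces agree on average.
    return "harmony" if feats["mean_agreement"] > eps else "conflict"
```

A usage example: identical partner forces yield "harmony", while equal and opposite forces (pure squeezing with no net motion) yield "conflict" with zero net power.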
In the near future, humans and robots are expected to perform collaborative tasks involving physical interaction in various environments, such as homes, hospitals, and factories. Robots are good at precision, strength, and repetition, while humans are better at cognitive tasks. The concept known as physical human-robot interaction (pHRI) takes advantage of these complementary abilities and is highly beneficial, bringing speed, flexibility, and ergonomics to the execution of complex tasks. Current research in pHRI focuses on designing controllers and developing new methods that offer a better tradeoff between robust stability and high interaction performance. In this paper, we propose a new controller for pHRI systems: the fractional order admittance controller. The stability and transparency analyses of the new control system are performed computationally with a human in the loop. Impedance matching is proposed to map fractional order control parameters to integer order ones, and the stability robustness of the system is then studied analytically. Furthermore, the interaction performance is investigated experimentally through two human subject studies involving continuous contact with linear and nonlinear viscoelastic environments. The results indicate that the fractional order admittance controller can be made more robust and transparent than the integer order admittance controller, and that the use of the fractional order term can reduce human effort during tasks involving contact interactions with the environment.
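A fractional order admittance controller replaces the integer-order derivative of a classical admittance law with a derivative of non-integer order. As a hedged sketch of the underlying machinery only, the snippet below implements the standard Grünwald-Letnikov approximation of a fractional derivative; the paper's controller structure, parameter mapping, and stability analysis are not reproduced.

```python
import numpy as np

def gl_weights(alpha, n):
    """Grünwald-Letnikov binomial weights w_j for derivative order alpha."""
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    return w

def fractional_derivative(x, alpha, h):
    """Approximate D^alpha of a signal x sampled with step h.

    D^alpha x[k] ~ h^(-alpha) * sum_j w_j * x[k-j]; for alpha = 1 this
    reduces to the backward difference, for alpha = 0 to the identity.
    """
    w = gl_weights(alpha, len(x))
    return np.array([np.dot(w[:k + 1], x[k::-1])
                     for k in range(len(x))]) / h**alpha
```

A quick sanity check of the design choice: setting alpha to 1 recovers the ordinary discrete derivative, so intermediate orders interpolate smoothly between damping-like and inertia-like behavior, which is the degree of freedom a fractional order admittance term exploits.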
Physical human-robot interaction (pHRI) integrates the benefits of a human operator and a collaborative robot in tasks involving physical interaction, with the aim of increasing task performance. However, the design of interaction controllers that achieve safe and transparent operation is challenging, mainly due to the contradicting nature of these objectives. Knowing that attaining perfect transparency is practically unachievable, controllers that allow a better compromise between these objectives are desirable. In this paper, we propose a multi-criteria optimization framework, which jointly optimizes the stability robustness and transparency of a closed-loop pHRI system for a given interaction controller. In particular, we propose a Pareto optimization framework that allows the designer to make informed decisions by studying the trade-off between stability robustness and transparency thoroughly. The proposed framework involves a search over the discretized controller parameter space to compute the Pareto front curve depicting the trade-off between stability robustness and transparency. Studying this trade-off, a decision can be made to select the set of controller parameters that yields the maximum attainable transparency and stability robustness. Hence, the proposed framework not only leads to the design of an optimal controller but also enables a fair comparison among different controllers. To demonstrate the practical use of the proposed approach, integer and fractional order admittance controllers are studied as a case study and compared both analytically and experimentally. The experimental results validate the proposed design framework and also show that the achievable transparency under the fractional order admittance controller is higher than that under the integer order one, when both controllers are designed to ensure the same level of stability robustness.
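The Pareto-front step of such a grid search can be sketched in a few lines. The sketch assumes both objectives are to be maximized and uses illustrative objective names; the paper's actual stability-robustness and transparency metrics are not modeled here.

```python
def pareto_front(points):
    """Indices of non-dominated points when maximizing both objectives.

    points: sequence of (stability_robustness, transparency) pairs, one per
    sampled controller parameter set (names are illustrative assumptions).
    A point is dominated if another point is at least as good in both
    objectives and strictly better in one.
    """
    front = []
    for i, (r, t) in enumerate(points):
        dominated = any(
            (r2 >= r and t2 >= t) and (r2 > r or t2 > t)
            for j, (r2, t2) in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front
```

For example, with candidate objective pairs (1.0, 1.0), (2.0, 0.5), (0.5, 2.0), and (0.4, 0.4), the last point is dominated by the first and drops out, leaving the trade-off curve the designer would inspect.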
Index Terms: Multi-criteria optimization, Pareto optimization, interaction controllers, fractional order control, physical human-robot interaction, trade-off between stability robustness and transparency.
An active research goal for human-computer interaction is to allow humans to communicate with computers in an intuitive and natural fashion, especially in real-life interaction scenarios. One approach that has been advocated to achieve this is to build computer systems with human-like qualities and capabilities. In this paper, we present insight into how human-computer interaction can be enriched by endowing computers with behavioral patterns that naturally appear in human-human negotiation scenarios. For this purpose, we introduce a two-party negotiation game specifically built for studying the effectiveness of haptic and audiovisual cues in conveying negotiation-related behaviors. The game is centered around a real-time continuous two-party negotiation scenario based on the existing game theory and negotiation literature. During the game, humans are confronted with a computer opponent that can display different behaviors, such as concession, competition, and negotiation. Through a user study, we show that the behaviors associated with human negotiation can be incorporated into human-computer interaction, and that the addition of haptic cues provides a statistically significant increase in the human recognition accuracy of machine-displayed behaviors. In addition to aspects of conveying these negotiation-related behaviors, we also report game-theoretical aspects of the overall interaction experience. In particular, we show that, as reported in the game theory literature, certain negotiation strategies such as tit-for-tat may generate maximum combined utility for the negotiating parties, providing an excellent balance between the energy spent by the user and the combined utility of the negotiating parties.
Proceedings of IEEE International Conference on Robot and Human Interactive Communication (Ro-Man), 2020
With the recent advances in cobot (collaborative robot) technology, we can now work with a robot side by side in manufacturing environments. The collaboration between human and cobot can be enhanced by detecting the intentions of the human to make production more flexible and effective in future factories. In this regard, interpreting human intention and then adjusting the cobot's controller accordingly to assist the human is a core challenge in physical human-robot interaction (pHRI). In this study, we propose a classifier based on Artificial Neural Networks (ANN) that predicts the intended direction of human movement by utilizing electromyography (EMG) signals acquired from human arm muscles. We employ this classifier in an admittance control architecture to constrain human arm motion along the intended direction and prevent undesired movements along other directions. The proposed classifier and the control architecture have been validated through a path following task utilizing a KUKA LBR iiwa 7 R800 cobot. The results of our experimental study with 6 participants show that the proposed architecture provides effective assistance to the human during the execution of the task and reduces undesired motion errors, without sacrificing task completion time.
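The pipeline described above (EMG windows, feature extraction, direction classification) can be sketched with a simpler stand-in. The snippet below uses two standard time-domain EMG features and a nearest-centroid classifier in place of the paper's ANN; the channel count, feature choice, and class labels are illustrative assumptions.

```python
import numpy as np

def emg_features(window):
    """Standard time-domain EMG features.

    window: (T, C) raw EMG samples, one column per muscle channel.
    """
    mav = np.mean(np.abs(window), axis=0)        # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=0))  # root mean square
    return np.concatenate([mav, rms])

class NearestCentroid:
    """Minimal stand-in for the paper's ANN direction classifier."""

    def fit(self, X, y):
        y = np.asarray(y)
        self.classes_ = sorted(set(y))
        self.centroids_ = {c: X[y == c].mean(axis=0) for c in self.classes_}
        return self

    def predict(self, X):
        return [min(self.classes_,
                    key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in X]
```

In an admittance control loop, the predicted label would select the direction along which motion is left compliant, with stiff virtual constraints applied along the others.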
Proceedings of 17th IEEE Symposium on Haptics, 2010
We investigate how collaborative guidance can be realized in multi-modal virtual environments for dynamic tasks involving motor control. Haptic guidance in our context can be defined as any form of force/tactile feedback that the computer generates to help a user execute a task in a faster, more accurate, and subjectively more pleasing fashion. In particular, we are interested in determining guidance mechanisms that best facilitate task performance and arouse a natural sense of collaboration. We suggest that a haptic guidance system can be further improved if it is supplemented with a role exchange mechanism, which allows the computer to adjust the forces it applies to the user in response to his/her actions. Recent work on collaboration and role exchange has presented new perspectives on defining roles and interaction. However, existing approaches mainly focus on relatively basic environments where the state of the system can be defined with a few parameters. We designed and implemented a complex and highly dynamic multimodal game for testing our interaction model. Since the state space of our application is complex, role exchange needs to be implemented carefully. We defined a novel negotiation process, which facilitates dynamic communication between the user and the computer, and realizes the exchange of roles using a three-state finite state machine. Our preliminary results indicate that even though the negotiation and role exchange mechanism we adopted does not improve performance by every evaluation criterion, it introduces a more personal and human-like interaction model.
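A three-state role exchange machine of the kind mentioned above can be sketched as follows. The state names, the force-threshold trigger, and the hold count are all illustrative assumptions; the paper's actual negotiation protocol is richer than this.

```python
from enum import Enum, auto

class Role(Enum):
    COMPUTER_IN_CONTROL = auto()
    NEGOTIATION = auto()
    HUMAN_IN_CONTROL = auto()

class RoleExchangeFSM:
    """Toy three-state negotiation machine (assumed design, not the paper's)."""

    def __init__(self, request_threshold=5.0, hold_steps=3):
        self.state = Role.COMPUTER_IN_CONTROL
        self.thr = request_threshold   # force above which the user requests control
        self.hold = hold_steps         # steps the request must be sustained
        self.count = 0

    def step(self, user_force):
        if self.state is Role.COMPUTER_IN_CONTROL:
            if abs(user_force) > self.thr:        # user pushes: open negotiation
                self.state = Role.NEGOTIATION
                self.count = 0
        elif self.state is Role.NEGOTIATION:
            if abs(user_force) > self.thr:
                self.count += 1
                if self.count >= self.hold:       # sustained push: grant control
                    self.state = Role.HUMAN_IN_CONTROL
            else:                                 # push abandoned: revert
                self.state = Role.COMPUTER_IN_CONTROL
        elif self.state is Role.HUMAN_IN_CONTROL:
            if abs(user_force) < 0.1 * self.thr:  # user relaxes: hand back control
                self.state = Role.COMPUTER_IN_CONTROL
        return self.state
```

The negotiation state is the key design choice: it prevents a single force spike from flipping control, so the exchange feels deliberate rather than twitchy.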
Proceedings of IEEE International Conference on Robotics and Automation (ICRA), 2020
In today's automation-driven manufacturing environments, emerging technologies like cobots (collaborative robots) and augmented reality interfaces can help integrate humans into the production workflow to benefit from their adaptability and cognitive skills. In such settings, humans are expected to work with robots side by side and physically interact with them. However, the trade-off between stability and transparency is a core challenge in the presence of physical human-robot interaction (pHRI). While stability is of utmost importance for safety, transparency is required for fully exploiting the precision and ability of robots in handling labor-intensive tasks. In this work, we propose a new variable admittance controller based on fractional order control to handle this trade-off more effectively. We compared the performance of the fractional order variable admittance controller with a classical admittance controller with fixed parameters as a baseline and with an integer order variable admittance controller during a realistic drilling task. Our comparisons indicate that the proposed controller leads to a more transparent interaction than the other controllers without sacrificing stability. We also demonstrate a use case for an augmented reality (AR) headset, which can augment human sensory capabilities for reaching a certain drilling depth otherwise not possible without changing the role of the robot as the decision maker.
The development of robots that can physically cooperate with humans has attracted interest in recent decades. Obviously, this effort requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused on inferring human intents in terms of intermediate or terminal goals in physical tasks. On the other hand, working side by side with people, an autonomous robot additionally needs in-depth information about the underlying haptic interaction patterns that are typically encountered during human-human cooperation. However, to our knowledge, no study has yet focused on characterizing such detailed information. In this sense, this work is pioneering as an effort to gain deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human-interaction dataset, which captures the interaction of two humans who collaboratively transport an object in a haptics-enabled virtual environment. In light of the information gained by studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: in any cooperative task, the interacting humans either 1) work in harmony, 2) cope with conflicts, or 3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns; we then propose five different feature sets, comprising force-, velocity-, and power-related information, for the classification of these patterns. Our evaluation shows that using a multi-class support vector machine (SVM) classifier, we can accomplish a correct classification rate of 86 percent for the identification of interaction patterns, an accuracy obtained by fusing the most informative features selected by the Minimum Redundancy Maximum Relevance (mRMR) feature selection method.
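The mRMR selection step mentioned above can be sketched as a greedy search. True mRMR scores features by mutual information; the sketch below substitutes absolute Pearson correlation as a crude proxy, so it only illustrates the relevance-minus-redundancy idea, not the paper's exact method.

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy minimum-redundancy maximum-relevance feature selection.

    Uses |Pearson correlation| as a simple stand-in for mutual information
    (an assumption for illustration). X: (n_samples, n_features), y: (n_samples,).
    Returns the indices of k selected feature columns.
    """
    n = X.shape[1]
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n)])
    selected = [int(np.argmax(relevance))]       # start from the most relevant
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n):
            if j in selected:
                continue
            # penalize similarity to features already chosen
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

The relevance-minus-redundancy score is what lets the method prefer a moderately relevant but novel feature over a highly relevant duplicate of one already picked.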
Since recent achievements in robotics research have softened the strict separation of the working spaces of humans and robots, close interaction between humans and robots is rapidly coming into reach. In this context, physical human-robot interaction raises a number of questions regarding a desired intuitive robot behavior. The continuous bilateral information and energy exchange requires appropriate continuous robot feedback. Investigating a cooperative manipulation task, the desired behavior is a combination of an urge to fulfill the task, a smooth instant reactive behavior to human force inputs, and an assignment of the task effort to the cooperating agents. In this paper, a formal analysis of human-robot cooperative load transport is presented. Three different possibilities for the assignment of task effort are proposed. Two proposed dynamic role exchange mechanisms adjust the robot's urge to complete the task based on the human feedback. For comparison, a static role allocation strategy not relying on the human agreement feedback is investigated as well. All three role allocation mechanisms are evaluated in a user study that involves large-scale kinesthetic interaction and full-body human motion. Results show tradeoffs between subjective and objective performance measures, indicating a clear objective advantage of the proposed dynamic role allocation scheme.
In human-computer collaboration involving haptics, a key issue that remains to be solved is establishing intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, their ability to improve efficiency and effectiveness in dynamic tasks is limited, since they lack the adaptability, versatility, and awareness of a human. We suggest that the communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties to facilitate a more intuitive interaction setup. In this paper, we investigate the utility of such a dynamic role exchange mechanism, where partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine the energy consumption, the work done on the manipulated object, and the joint efficiency in addition to the task performance. We show that, when compared to an equal control condition, a role exchange mechanism improves task performance and the joint efficiency of the partners. We also show that augmenting the system with additional informative visual and vibrotactile cues, which are used to display the state of interaction, allows the users to become aware of the underlying role exchange mechanism and utilize it in favor of the task. These cues also improve the user's sense of interaction and reinforce his/her belief that the computer aids with the execution of the task.
Future touch screen applications will include multiple tactile stimuli displayed simultaneously or consecutively to a single finger or multiple fingers. These applications should be designed with the human tactile masking mechanism in mind, since it is known that presenting one stimulus may interfere with the perception of another. In this study, we investigate the effect of masking on the tactile perception of electrovibration displayed on touch screens. Through psychophysical experiments conducted with nine subjects, we measured the masked thresholds of sinusoidal electrovibration bursts (125 Hz) under two masking conditions: simultaneous and pedestal. The masking stimuli were noise bursts, also presented by electrovibration, applied at five different sensation levels varying from 2 to 22 dB SL. For each subject, the detection thresholds were elevated as linear functions of the masking levels for both masking types. We observed that masking was more effective with pedestal masking than with simultaneous masking. Moreover, to investigate the effect of tactile masking on our haptic perception of edge sharpness, we compared the perceived sharpness of edges separating two textured regions displayed with and without various masking stimuli. Our results suggest that sharpness perception depends on the local contrast between background and foreground stimuli, which varies as a function of masking amplitude and the activation levels of frequency-dependent psychophysical channels.
In this study, we investigated the effect of input voltage waveform on our haptic perception of electrovibration on touch screens. Through psychophysical experiments performed with eight subjects, we first measured the detection thresholds of electrovibration stimuli generated by sinusoidal and square voltages at various fundamental frequencies. We observed that the subjects were more sensitive to stimuli generated by square wave voltage than by sinusoidal voltage for frequencies lower than 60 Hz. Using Matlab simulations, we showed that the sensation difference between the waveforms at low fundamental frequencies occurred due to the frequency-dependent electrical properties of human skin and human tactile sensitivity. To validate our simulations, we conducted a second experiment with another group of eight subjects. We first actuated the touch screen at the threshold voltages estimated in the first experiment and then measured the contact force and acceleration acting on the index fingers of the subjects moving on the screen at a constant speed. We analyzed the collected data in the frequency domain using the human vibrotactile sensitivity curve. The results suggested that the Pacinian channel was the primary psychophysical channel in the detection of the electrovibration stimuli caused by all the square-wave inputs tested in this study. We also observed that the measured force and acceleration data were affected by finger speed in a complex manner, suggesting that it may also affect our haptic perception accordingly.
We review the current technology underlying surface haptics that converts passive touch surfaces to active ones (machine haptics), our perception of tactile stimuli displayed through active touch surfaces (human haptics), their potential applications (human-machine interaction), and finally the challenges ahead of us in making them available through commercial systems. This review primarily covers the tactile interactions of human fingers or hands with surface-haptics displays by focusing on the three most popular actuation methods: vibrotactile, electrostatic, and ultrasonic.
Masking has been used to study human perception of tactile stimuli, including those created by electrovibration on touch screens. Earlier studies have investigated the effect of on-site masking on the tactile perception of electrovibration. In this article, we investigated whether it is possible to change the absolute detection threshold and the intensity difference threshold of electrovibration at the fingertip of the index finger via remote masking, i.e., by applying a (mechanical) vibrotactile stimulus on the proximal phalanx of the same finger. The masking stimuli were generated by a voice coil (the Haptuator). For 16 participants, we first measured the detection thresholds for electrovibration at the fingertip and for vibrotactile stimuli at the proximal phalanx. Then, the vibrations on the skin were measured at four different locations on the index finger of the subjects to investigate how the mechanical masking stimulus propagated as the masking level was varied. Later, the masked absolute thresholds of eight participants were measured. Finally, for another group of eight participants, intensity difference thresholds were measured in the presence/absence of vibrotactile masking stimuli. Our results show that the vibrotactile masking stimuli generated sub-threshold vibrations around the fingertip and, hence, probably did not mechanically interfere with the electrovibration stimulus. However, there was a clear psychophysical masking effect due to central neural processes. We measured the effect of masking stimuli, up to 40 dB SL, on the difference threshold at four different intensity standards of electrovibration. We proposed two models, based on hypothetical neural signals, for predicting the masking effect on intensity difference thresholds for electrovibration: an amplitude model and an energy model. The energy model was able to predict the effect of masking more accurately, especially at high-intensity masking levels.
Tactile discrimination and roughness perception of real textures are extensively studied, and the underlying perceptual mechanisms are relatively well established. However, tactile perception of virtual textures rendered by friction modulation techniques on touch surfaces has not yet been investigated in detail. In this study, we investigated our ability to discriminate two consecutive step changes in friction (called edges), followed by discrimination and roughness perception of multiple edges (called periodic gratings). The results showed that discrimination of two consecutive edges was significantly influenced by edge sequence: a step fall in friction (FF) followed by a step rise in friction (RF) was discriminated more easily than the reverse order. On the other hand, periodic gratings displayed by consecutive sequences of FF followed by RF were perceived with the same acuity as the reverse order. Independent of the edge sequence, we found that a relative difference of 14% in spatial period was required to discriminate two periodic gratings. Moreover, the roughness perception of periodic gratings decreased with increasing spatial period for the range that we investigated (spatial period > 2 mm), despite the lack of spatial cues on grating height. We also observed that the rate of change in the friction coefficient was better correlated with roughness perception than the friction coefficient itself. These results will further help in understanding and designing virtual textures for touch surfaces.
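The 14% discrimination threshold reported above can be cast as a simple Weber-fraction check. The exact definition of "relative difference" (here, taken with respect to the smaller period) is an assumption for illustration, not the paper's stated convention.

```python
def discriminable(period_a, period_b, weber_fraction=0.14):
    """True if two grating spatial periods (same units, e.g. mm) differ by
    more than an assumed ~14% relative-difference threshold.

    The reference choice (smaller period) and the threshold value are
    illustrative assumptions based on the reported result.
    """
    return abs(period_a - period_b) / min(period_a, period_b) > weber_fraction
```

For example, 2.0 mm versus 2.4 mm gratings (20% apart) would be predicted discriminable, while 2.0 mm versus 2.2 mm (10% apart) would not.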
IEEE Transactions on Haptics, Vol. 11, No. 4, pp. 599-610, 2018
To render tactile cues on a touchscreen by friction modulation, it is important to understand how humans perceive a change in friction. In this study, we investigate the relations between the perceived change in friction on an ultrasonically actuated touchscreen and the parameters involved in contact between the finger and its surface. We first estimate the perceptual thresholds for detecting rising and falling friction while the finger is sliding on the touch surface. Then, we conduct intensity scaling experiments and investigate the effect of finger sliding velocity, normal force, and rise/fall time of the vibration amplitude (transition time) on the perceived intensity of the change in friction. In order to better understand the role of contact mechanics, we also look into the correlations between the perceived intensities of the subjects and several parameters involved in contact. The results of our experiments show that the contrast and rate of change in tangential force were best correlated with the perceived intensity. The subjects perceived rising friction more strongly than falling friction, particularly at higher tangential force contrast. We argue that this is due to hysteresis and the viscoelastic behavior of the fingertip under tangential loading. The results also showed that transition time and normal force have a significant effect on our tactile perception.
Displaying tactile feedback through a touchscreen via electrovibration has many potential applications in mobile devices, consumer electronics, home appliances, and the automotive industry, though our knowledge and understanding of the underlying contact mechanics are very limited. An experimental study was conducted to investigate the contact evolution between the human finger and a touchscreen under electrovibration using a robotic setup and an imaging system. The results show that the effect of electrovibration is only present during full slip, but not before slip. Hence, the coefficient of friction increases under electrovibration during full slip, as expected, but the apparent contact area is significantly smaller during full slip when compared to the no-electrovibration condition. It is suggested that the main cause of the increase in friction during full slip is an increase in real contact area, while the reduction in apparent area is due to stiffening of the finger skin in the tangential direction.
There is a growing interest in touchscreens displaying tactile feedback due to their tremendous potential in consumer electronics. In these systems, the friction between the user's fingerpad and the surface of the touchscreen is modulated to display tactile effects. One of the promising techniques used in this regard is electrostatic actuation. If, for example, an alternating voltage is applied to the conductive layer of a surface capacitive touchscreen, an attractive electrostatic force is generated between the finger and the surface, which results in an increase in the frictional forces acting on the finger moving on the surface. By altering the amplitude, frequency, and waveform of this signal, a rich set of tactile effects can be generated on the touchscreen. Despite the ease of implementation and its powerful effect on our tactile sensation, the contact mechanics leading to an increase in friction due to electroadhesion have not been fully understood yet. In this paper, we present experimental results on how the friction between a finger and a touchscreen depends on the electrostatic attraction and the applied normal pressure. The dependency of the finger-touchscreen interaction on the applied voltage, and on several other parameters, is also investigated using a mean field theory based on multiscale contact mechanics. We present a detailed theoretical analysis of how the area of real contact and the friction force depend on the contact parameters, and show that it is possible to further augment the friction force, and hence the tactile feedback displayed to the user, by carefully choosing those parameters.
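The voltage-to-friction chain described above can be sketched with the textbook parallel-plate approximation. This is a deliberate simplification of the multiscale mean-field contact model discussed in the paper, and the gap thickness, permittivity, and contact parameters below are illustrative values, not measured ones.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def electrostatic_pressure(v_rms, gap, eps_r=1.0):
    """Parallel-plate estimate of electroadhesive pressure (Pa).

    v_rms: RMS voltage (V) across an insulating gap of thickness gap (m)
    with relative permittivity eps_r. A crude approximation only: the real
    finger-screen interface is rough, layered, and frequency dependent.
    """
    return 0.5 * EPS0 * eps_r * (v_rms / gap) ** 2

def sliding_friction(mu, normal_force, contact_area, v_rms, gap, eps_r=1.0):
    """Amontons-style friction augmented by the electrostatic load.

    The electrostatic pressure acts over contact_area (m^2) as an extra
    normal load that the applied voltage adds to normal_force (N).
    """
    f_e = electrostatic_pressure(v_rms, gap, eps_r) * contact_area
    return mu * (normal_force + f_e)
```

With zero voltage the model reduces to plain Coulomb friction, and raising the voltage monotonically increases the sliding friction, which is the qualitative effect electrovibration displays exploit.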
IEEE Transactions on Haptics, Vol. 13, No. 1, pp. 137-143, 2020
Rendering tactile effects on a touch screen via electrovibration has many potential applications. However, our knowledge of the tactile perception of a change in friction and of the underlying contact mechanics is very limited. In this article, we investigate the tactile perception and the contact mechanics for a step change in friction under electrovibration during relative sliding between a finger and the surface of a capacitive touch screen. First, we conduct magnitude estimation experiments to investigate the role of normal force and sliding velocity on the perceived tactile intensity for a step increase and a step decrease in friction, called rising friction (RF) and falling friction (FF). To investigate the contact mechanics involved in RF and FF, we then measure the frictional force, the apparent contact area, and the strains acting on the fingerpad during sliding at a constant velocity under three different normal loads using a custom-made experimental setup. The results show that the participants perceived RF more strongly than FF, and that both the normal force and sliding velocity significantly influenced their perception. These results are supported by our mechanical measurements; the relative change in friction, the apparent contact area, and the strain in the sliding direction were all higher for RF than for FF, especially for low normal forces. Taken together, our results suggest that different contact mechanics take place during RF and FF due to the viscoelastic behavior of the fingerpad skin, and that those differences influence our tactile perception of a step change in friction.
Haptics provides a natural and intuitive channel of communication during the interaction of two h... more Haptics provides a natural and intuitive channel of communication during the interaction of two humans in complex physical tasks, such as joint object transportation. However, despite the utmost importance of touch in physical interactions, the use of haptics is underrepresented when developing intelligent systems. This study explores the prominence of haptic data to extract information about underlying interaction patterns within human-human cooperation. For this purpose, we design salient haptic features describing the collaboration quality within a physical dyadic task and investigate the use of these features to classify the interaction patterns. We categorize the interaction into four discrete behavior classes. These classes describe whether the partners work in harmony or face conflicts while jointly transporting an object through translational or rotational movements. We test the proposed features on a physical human-human interaction (pHHI) dataset, consisting of data collected from 12 human dyads. Using these data, we verify the salience of haptic features by achieving a correct classification rate over 91% using a Random Forest classifier.
In the near future, humans and robots are expected to perform collaborative tasks involving physical interaction in various environments, such as homes, hospitals, and factories. Robots are good at precision, strength, and repetition, while humans are better at cognitive tasks. This concept, known as physical human-robot interaction (pHRI), takes advantage of these complementary abilities and is highly beneficial, bringing speed, flexibility, and ergonomics to the execution of complex tasks. Current research in pHRI focuses on designing controllers and developing new methods that offer a better trade-off between robust stability and high interaction performance. In this paper, we propose a new controller for pHRI systems: the fractional order admittance controller. The stability and transparency analyses of the new control system are performed computationally with a human in the loop. Impedance matching is proposed to map fractional order control parameters to integer order ones, and the stability robustness of the system is then studied analytically. Furthermore, the interaction performance is investigated experimentally through two human subject studies involving continuous contact with linear and nonlinear viscoelastic environments. The results indicate that the fractional order admittance controller can be made more robust and transparent than the integer order admittance controller, and that the fractional order term can reduce human effort during tasks involving contact interactions with the environment.
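The abstract does not specify how the fractional order term is discretized; one standard way to realize a fractional derivative s^alpha numerically is the Grünwald-Letzikov (GL) approximation. A minimal sketch, assuming uniform sampling with step h:

```python
def gl_weights(alpha, n):
    """Grunwald-Letnikov binomial weights w_k = (-1)^k * C(alpha, k),
    computed with the standard recurrence w_0 = 1,
    w_k = w_{k-1} * (1 - (alpha + 1) / k)."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return w

def fractional_derivative(x, alpha, h):
    """Approximate D^alpha x at each sample of x (GL definition, step h).
    For alpha = 1 this reduces to the backward difference; for alpha = 0
    it returns the signal itself."""
    w = gl_weights(alpha, len(x))
    scale = h ** (-alpha)
    return [scale * sum(w[k] * x[n - k] for k in range(n + 1))
            for n in range(len(x))]
```

Inside an admittance loop, such a term would let the controller render a dynamics like m D^alpha(v) + b v = f_h, with alpha tuned between integer orders.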
Physical human-robot interaction (pHRI) integrates the benefits of a human operator and a collaborative robot in tasks involving physical interaction, with the aim of increasing task performance. However, the design of interaction controllers that achieve safe and transparent operation is challenging, mainly due to the contradicting nature of these objectives. Knowing that perfect transparency is practically unachievable, controllers that allow a better compromise between these objectives are desirable. In this paper, we propose a multi-criteria optimization framework that jointly optimizes the stability robustness and transparency of a closed-loop pHRI system for a given interaction controller. In particular, we propose a Pareto optimization framework that allows the designer to make informed decisions by studying the trade-off between stability robustness and transparency thoroughly. The proposed framework involves a search over the discretized controller parameter space to compute the Pareto front curve depicting this trade-off. Studying the trade-off, a decision can be made to select the set of controller parameters that yields the maximum attainable transparency and stability robustness. Hence, the proposed framework not only leads to the design of an optimal controller but also enables a fair comparison among different controllers. To demonstrate the practical use of the proposed approach, integer and fractional order admittance controllers are studied as a case study and compared both analytically and experimentally. The experimental results validate the proposed design framework and also show that the achievable transparency under the fractional order admittance controller is higher than that of the integer order one when both controllers are designed to ensure the same level of stability robustness.
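The core of such a framework is a non-dominated (Pareto) filter over objective values evaluated at each point of the discretized parameter grid. A minimal sketch, assuming both objectives (stability robustness, transparency) are to be maximized and have already been computed per candidate parameter set:

```python
def pareto_front(points):
    """Return the non-dominated (objective1, objective2) pairs when both
    objectives are to be maximized. A point is dominated if some other
    point is at least as good in both objectives and different from it."""
    front = []
    for p in points:
        dominated = any(q[0] >= p[0] and q[1] >= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Each tuple: (stability robustness, transparency) for one parameter set
candidates = [(1, 3), (2, 2), (3, 1), (1, 1), (2, 3)]
front = pareto_front(candidates)
```

The designer then picks a point on the front, trading one objective against the other with the trade-off made explicit rather than hidden in a single weighted cost.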
Index Terms: Multi-criteria optimization, Pareto optimization, interaction controllers, fractional order control, physical human-robot interaction, trade-off between stability robustness and transparency.
An active research goal for human-computer interaction is to allow humans to communicate with computers in an intuitive and natural fashion, especially in real-life interaction scenarios. One approach advocated to achieve this is to build computer systems with human-like qualities and capabilities. In this paper, we present insight on how human-computer interaction can be enriched by endowing computers with behavioral patterns that naturally appear in human-human negotiation scenarios. For this purpose, we introduce a two-party negotiation game specifically built for studying the effectiveness of haptic and audiovisual cues in conveying negotiation-related behaviors. The game is centered around a real-time, continuous, two-party negotiation scenario based on the existing game theory and negotiation literature. During the game, humans are confronted with a computer opponent that can display different behaviors, such as concession, competition, and negotiation. Through a user study, we show that the behaviors associated with human negotiation can be incorporated into human-computer interaction, and that the addition of haptic cues provides a statistically significant increase in the human recognition accuracy of machine-displayed behaviors. In addition to conveying these negotiation-related behaviors, we also report on game-theoretical aspects of the overall interaction experience. In particular, we show that, as reported in the game theory literature, certain negotiation strategies such as tit-for-tat may generate maximum combined utility for the negotiating parties, providing an excellent balance between the energy spent by the user and the combined utility of the negotiating parties.
Proceedings of IEEE International Conference on Robot and Human Interactive Communication (Ro-Man), 2020
With the recent advances in cobot (collaborative robot) technology, we can now work with a robot side by side in manufacturing environments. The collaboration between human and cobot can be enhanced by detecting the intentions of the human, making production more flexible and effective in future factories. In this regard, interpreting human intention and then adjusting the controller of the cobot accordingly to assist the human is a core challenge in physical human-robot interaction (pHRI). In this study, we propose a classifier based on Artificial Neural Networks (ANN) that predicts the intended direction of human movement by utilizing electromyography (EMG) signals acquired from human arm muscles. We employ this classifier in an admittance control architecture to constrain human arm motion along the intended direction and prevent undesired movements along other directions. The proposed classifier and the control architecture have been validated through a path following task utilizing a KUKA LBR iiwa 7 R800 cobot. The results of our experimental study with 6 participants show that the proposed architecture provides effective assistance to the human during the execution of the task and reduces undesired motion errors without sacrificing task completion time.
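The specific EMG features fed to the ANN are not listed above; the sketch below computes four time-domain features commonly used for EMG-based movement classification (mean absolute value, root mean square, waveform length, and zero-crossing count) over one analysis window:

```python
def emg_features(window):
    """Standard time-domain EMG features for one analysis window:
    MAV  - mean absolute value (overall activation level)
    RMS  - root mean square (signal power)
    WL   - waveform length (cumulative sample-to-sample change)
    ZC   - number of zero crossings (crude frequency content)"""
    n = len(window)
    mav = sum(abs(x) for x in window) / n
    rms = (sum(x * x for x in window) / n) ** 0.5
    wl = sum(abs(window[i] - window[i - 1]) for i in range(1, n))
    zc = sum(1 for i in range(1, n) if window[i - 1] * window[i] < 0)
    return mav, rms, wl, zc
```

One such feature tuple per muscle channel, concatenated across channels, is a typical input vector for a small feed-forward classifier.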
Proceedings of 17th IEEE Symposium on Haptics, 2010
We investigate how collaborative guidance can be realized in multi-modal virtual environments for dynamic tasks involving motor control. Haptic guidance in our context can be defined as any form of force/tactile feedback that the computer generates to help a user execute a task in a faster, more accurate, and subjectively more pleasing fashion. In particular, we are interested in determining the guidance mechanisms that best facilitate task performance and arouse a natural sense of collaboration. We suggest that a haptic guidance system can be further improved if it is supplemented with a role exchange mechanism, which allows the computer to adjust the forces it applies to the user in response to his/her actions. Recent work on collaboration and role exchange has presented new perspectives on defining roles and interaction. However, existing approaches mainly focus on relatively basic environments where the state of the system can be defined with a few parameters. We designed and implemented a complex and highly dynamic multimodal game for testing our interaction model. Since the state space of our application is complex, role exchange needs to be implemented carefully. We defined a novel negotiation process, which facilitates dynamic communication between the user and the computer, and realizes the exchange of roles using a three-state finite state machine. Our preliminary results indicate that even though the negotiation and role exchange mechanism we adopted does not improve performance by every evaluation criterion, it introduces a more personal and human-like interaction model.
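The paper's actual states and transition conditions are not reproduced above; purely as an illustration, a three-state role-exchange machine driven by the magnitude of the user's force input might look like this (state names and thresholds are hypothetical):

```python
# Hypothetical three-state role-exchange FSM. Transitions are driven by
# how strongly the user pushes against the computer's guidance.

STATES = ("COMPUTER_LEADS", "NEGOTIATING", "USER_LEADS")

def step(state, user_force, threshold=1.0):
    """Advance the FSM one tick based on the magnitude of user input."""
    if state == "COMPUTER_LEADS":
        # A strong push opens a negotiation for control
        return "NEGOTIATING" if abs(user_force) > threshold else state
    if state == "NEGOTIATING":
        if abs(user_force) > 2 * threshold:
            return "USER_LEADS"       # user insists: hand over control
        if abs(user_force) < 0.5 * threshold:
            return "COMPUTER_LEADS"   # user yields: computer resumes lead
        return state
    # USER_LEADS: re-open negotiation when the user relaxes
    return "NEGOTIATING" if abs(user_force) < 0.5 * threshold else state
```

The negotiation state is what distinguishes this from a hard switch: control is traded gradually rather than toggled.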
Proceedings of IEEE International Conference on Robotics and Automation (ICRA), 2020
In today's automation-driven manufacturing environments, emerging technologies like cobots (collaborative robots) and augmented reality interfaces can help integrate humans into the production workflow to benefit from their adaptability and cognitive skills. In such settings, humans are expected to work with robots side by side and physically interact with them. However, the trade-off between stability and transparency is a core challenge in the presence of physical human-robot interaction (pHRI). While stability is of utmost importance for safety, transparency is required for fully exploiting the precision and ability of robots in handling labor-intensive tasks. In this work, we propose a new variable admittance controller based on fractional order control to handle this trade-off more effectively. We compared the performance of the fractional order variable admittance controller with a classical admittance controller with fixed parameters as a baseline and with an integer order variable admittance controller during a realistic drilling task. Our comparisons indicate that the proposed controller led to a more transparent interaction than the other controllers without sacrificing stability. We also demonstrate a use case for an augmented reality (AR) headset, which can augment human sensory capabilities for reaching a certain drilling depth otherwise not possible without changing the role of the robot as the decision maker.
The development of robots that can physically cooperate with humans has attracted interest in the last decades. Obviously, this effort requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused on inferring human intents in terms of intermediate or terminal goals in physical tasks. On the other hand, working side by side with people, an autonomous robot additionally needs in-depth information about the underlying haptic interaction patterns that are typically encountered during human-human cooperation. However, to our knowledge, no study has yet focused on characterizing such detailed information. In this sense, this work is pioneering as an effort to gain a deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human interaction dataset, which captures the interaction of two humans who collaboratively transport an object in a haptics-enabled virtual environment. In the light of information gained by studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: in any cooperative task, the interacting humans either 1) work in harmony, 2) cope with conflicts, or 3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns and then propose five different feature sets, comprising force-, velocity-, and power-related information, for the classification of these patterns. Our evaluation shows that using a multi-class support vector machine (SVM) classifier, we can accomplish a correct classification rate of 86 percent for the identification of interaction patterns, an accuracy obtained by fusing a selected set of the most informative features with the Minimum Redundancy Maximum Relevance (mRMR) feature selection method.
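The three interaction types above (harmony, conflict, passive) lend themselves to a simple rule-based illustration; note that the paper's actual classifier is an SVM over force-, velocity-, and power-related feature sets, not these hand-written rules, and the threshold below is hypothetical:

```python
def interaction_type(f1, f2, eps=0.1):
    """Label one sample of a dyadic task from the partners' forces
    along one axis (illustrative rules only):
    - both forces negligible        -> passive
    - forces clearly opposing       -> conflict
    - otherwise (aligned effort)    -> harmony"""
    if abs(f1) < eps and abs(f2) < eps:
        return "passive"
    if f1 * f2 < 0 and min(abs(f1), abs(f2)) > eps:
        return "conflict"
    return "harmony"
```

A learned classifier replaces such brittle thresholds with boundaries fitted to the labeled dataset.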
Since the strict separation of the working spaces of humans and robots has softened due to recent robotics research achievements, close interaction of humans and robots comes rapidly into reach. In this context, physical human-robot interaction raises a number of questions regarding a desired, intuitive robot behavior. The continuous bilateral information and energy exchange requires appropriate continuous robot feedback. Investigating a cooperative manipulation task, the desired behavior is a combination of an urge to fulfill the task, a smooth instant reactive behavior to human force inputs, and an assignment of the task effort to the cooperating agents. In this paper, a formal analysis of human-robot cooperative load transport is presented. Three different possibilities for the assignment of task effort are proposed. Two proposed dynamic role exchange mechanisms adjust the robot's urge to complete the task based on the human feedback. For comparison, a static role allocation strategy not relying on the human agreement feedback is investigated as well. All three role allocation mechanisms are evaluated in a user study that involves large-scale kinesthetic interaction and full-body human motion. Results show trade-offs between subjective and objective performance measures, indicating a clear objective advantage of the proposed dynamic role allocation scheme.
In human-computer collaboration involving haptics, a key issue that remains to be solved is establishing intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, their ability to improve efficiency and effectiveness in dynamic tasks is limited, since they lack the adaptability, versatility, and awareness of a human. We suggest that the communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties to facilitate a more intuitive interaction. In this paper, we investigate the utility of such a dynamic role exchange mechanism, where partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine the energy consumption, the work done on the manipulated object, and the joint efficiency in addition to the task performance. We show that, compared to an equal control condition, a role exchange mechanism improves task performance and the joint efficiency of the partners. We also show that augmenting the system with additional informative visual and vibrotactile cues, which display the state of the interaction, allows the users to become aware of the underlying role exchange mechanism and utilize it in favor of the task. These cues also improve the user's sense of interaction and reinforce his/her belief that the computer aids with the execution of the task.
Future touch screen applications will include multiple tactile stimuli displayed simultaneously or consecutively to a single finger or multiple fingers. These applications should be designed by considering the human tactile masking mechanism, since it is known that presenting one stimulus may interfere with the perception of another. In this study, we investigate the effect of masking on the tactile perception of electrovibration displayed on touch screens. Through psychophysical experiments with nine subjects, we measured the masked thresholds of sinusoidal electrovibration bursts (125 Hz) under two masking conditions: simultaneous and pedestal. The masking stimuli were noise bursts, applied at five different sensation levels varying from 2 to 22 dB SL, also presented by electrovibration. For each subject, the detection thresholds were elevated as linear functions of the masking levels for both masking types. We observed that masking effectiveness was larger with pedestal masking than with simultaneous masking. Moreover, to investigate the effect of tactile masking on our haptic perception of edge sharpness, we compared the perceived sharpness of edges separating two textured regions displayed with and without various masking stimuli. Our results suggest that sharpness perception depends on the local contrast between background and foreground stimuli, which varies as a function of masking amplitude and the activation levels of frequency-dependent psychophysical channels.
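The reported linear elevation of detection thresholds with masking level can be written compactly as follows, where the slope k and the unmasked threshold T_0 are per-subject fitted constants (not given above) and M is the masking sensation level:

```latex
% Linear threshold-elevation model (k and T_0 fitted per subject and
% per masking type; values are not reproduced here)
T_{\text{masked}}(M) = T_0 + k\,M,
\qquad 2~\text{dB SL} \le M \le 22~\text{dB SL}
```

A larger fitted slope k for the pedestal condition would express the reported finding that pedestal masking was the more effective of the two.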
In this study, we investigated the effect of the input voltage waveform on our haptic perception of electrovibration on touch screens. Through psychophysical experiments performed with eight subjects, we first measured the detection thresholds of electrovibration stimuli generated by sinusoidal and square voltages at various fundamental frequencies. We observed that the subjects were more sensitive to stimuli generated by a square-wave voltage than by a sinusoidal one for frequencies lower than 60 Hz. Using Matlab simulations, we showed that the difference in sensation between the waveforms at low fundamental frequencies is due to the frequency-dependent electrical properties of human skin and human tactile sensitivity. To validate our simulations, we conducted a second experiment with another group of eight subjects. We first actuated the touch screen at the threshold voltages estimated in the first experiment and then measured the contact force and acceleration acting on the index fingers of the subjects moving on the screen at a constant speed. We analyzed the collected data in the frequency domain using the human vibrotactile sensitivity curve. The results suggested that the Pacinian channel was the primary psychophysical channel in the detection of the electrovibration stimuli generated by all the square-wave inputs tested in this study. We also observed that the measured force and acceleration data were affected by finger speed in a complex manner, suggesting that finger speed may also affect our haptic perception accordingly.
We review the current technology underlying surface haptics that converts passive touch surfaces to active ones (machine haptics), our perception of tactile stimuli displayed through active touch surfaces (human haptics), their potential applications (human-machine interaction), and finally the challenges ahead of us in making them available through commercial systems. This review primarily covers the tactile interactions of human fingers or hands with surface-haptics displays by focusing on the three most popular actuation methods: vibrotactile, electrostatic, and ultrasonic.
Masking has been used to study human perception of tactile stimuli, including those created by electrovibration on touch screens. Earlier studies have investigated the effect of on-site masking on the tactile perception of electrovibration. In this article, we investigated whether it is possible to change the absolute detection threshold and the intensity difference threshold of electrovibration at the fingertip of the index finger via remote masking, i.e., by applying a (mechanical) vibrotactile stimulus on the proximal phalanx of the same finger. The masking stimuli were generated by a voice coil (the Haptuator). For 16 participants, we first measured the detection thresholds for electrovibration at the fingertip and for vibrotactile stimuli at the proximal phalanx. Then, the vibrations on the skin were measured at four different locations on the index finger of the subjects to investigate how the mechanical masking stimulus propagated as the masking level was varied. Later, the masked absolute thresholds of eight participants were measured. Finally, for another group of eight participants, intensity difference thresholds were measured in the presence/absence of vibrotactile masking stimuli. Our results show that the vibrotactile masking stimuli generated sub-threshold vibrations around the fingertip and hence probably did not mechanically interfere with the electrovibration stimulus. However, there was a clear psychophysical masking effect due to central neural processes. We measured the effect of masking stimuli of up to 40 dB SL on the difference threshold at four different intensity standards of electrovibration. We propose two models, based on hypothetical neural signals, for predicting the masking effect on intensity difference thresholds for electrovibration: an amplitude model and an energy model. The energy model predicted the effect of masking more accurately, especially at high-intensity masking levels.
Tactile discrimination and roughness perception of real textures are extensively studied, and the underlying perceptual mechanisms are relatively well established. However, the tactile perception of virtual textures rendered by friction modulation techniques on touch surfaces has not yet been investigated in detail. In this study, we investigated our ability to discriminate two consecutive step changes in friction (called edges), followed by discrimination and roughness perception of multiple edges (called periodic gratings). The results showed that discrimination of two consecutive edges was significantly influenced by edge sequence: a step fall in friction (FF) followed by a step rise in friction (RF) was discriminated more easily than the reverse order. On the other hand, periodic gratings displayed by consecutive sequences of FF followed by RF were perceived with the same acuity as those displayed in the reverse order. Independent of the edge sequence, we found that a relative difference of 14% in spatial period was required to discriminate two periodic gratings. Moreover, the perceived roughness of periodic gratings decreased with increasing spatial period for the range that we investigated (spatial period > 2 mm), despite the lack of spatial cues on grating height. We also observed that the rate of change in the friction coefficient was better correlated with roughness perception than the friction coefficient itself. These results will further help to understand and design virtual textures for touch surfaces.
IEEE Transactions on Haptics, Vol. 11, No. 4, pp. 599-610, 2018
To render tactile cues on a touchscreen by friction modulation, it is important to understand how humans perceive a change in friction. In this study, we investigate the relations between the perceived change in friction on an ultrasonically actuated touchscreen and the parameters involved in the contact between the finger and its surface. We first estimate the perceptual thresholds for detecting rising and falling friction while the finger is sliding on the touch surface. Then, we conduct intensity scaling experiments and investigate the effect of finger sliding velocity, normal force, and the rise/fall time of the vibration amplitude (transition time) on the perceived intensity of the change in friction. To better understand the role of contact mechanics, we also examine the correlations between the perceived intensities reported by the subjects and several parameters involved in the contact. The results of our experiments show that the contrast and the rate of change in tangential force were best correlated with the perceived intensity. The subjects perceived rising friction more strongly than falling friction, particularly at higher tangential force contrast. We argue that this is due to hysteresis and the viscoelastic behavior of the fingertip under tangential loading. The results also showed that transition time and normal force have a significant effect on our tactile perception.
Displaying tactile feedback through a touchscreen via electrovibration has many potential applications in mobile devices, consumer electronics, home appliances, and the automotive industry, though our knowledge and understanding of the underlying contact mechanics are very limited. An experimental study was conducted to investigate the contact evolution between the human finger and a touchscreen under electrovibration using a robotic setup and an imaging system. The results show that the effect of electrovibration is only present during full slip but not before slip. Hence, the coefficient of friction increases under electrovibration during full slip, as expected, but the apparent contact area is significantly smaller during full slip when compared to that under the no-electrovibration condition. It is suggested that the main cause of the increase in friction during full slip is an increase in the real contact area, while the reduction in the apparent area is due to stiffening of the finger skin in the tangential direction.
There is a growing interest in touchscreens displaying tactile feedback due to their tremendous potential in consumer electronics. In these systems, the friction between the user's fingerpad and the surface of the touchscreen is modulated to display tactile effects. One of the promising techniques used in this regard is electrostatic actuation. If, for example, an alternating voltage is applied to the conductive layer of a surface capacitive touchscreen, an attractive electrostatic force is generated between the finger and the surface, which results in an increase in the frictional forces acting on the finger moving on the surface. By altering the amplitude, frequency, and waveform of this signal, a rich set of tactile effects can be generated on the touchscreen. Despite the ease of implementation and its powerful effect on our tactile sensation, the contact mechanics leading to an increase in friction due to electroadhesion have not been fully understood yet. In this paper, we present experimental results on how the friction between a finger and a touchscreen depends on the electrostatic attraction and the applied normal pressure. The dependency of the finger-touchscreen interaction on the applied voltage, and on several other parameters, is also investigated using a mean field theory based on multiscale contact mechanics. We present a detailed theoretical analysis of how the area of real contact and the friction force depend on the contact parameters, and show that it is possible to further augment the friction force, and hence the tactile feedback displayed to the user, by carefully choosing those parameters.
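As background for the electrostatic actuation described above, a first-order parallel-plate estimate of the attractive force is often used; note this simplification is not the paper's multiscale mean-field model. Here A is the apparent contact area, V the applied voltage, and d and eps_sc the thickness and relative permittivity of the stratum corneum:

```latex
% Parallel-plate estimate of the electroadhesive normal force and its
% effect on friction via a Coulomb model (first-order approximation only)
F_e \approx \frac{\varepsilon_0\,\varepsilon_{sc}\,A\,V^2}{2\,d^2},
\qquad
F_{\text{friction}} = \mu\,(F_n + F_e)
```

Since F_e grows with V^2, modulating the voltage amplitude modulates the friction felt by the sliding finger, which is the actuation principle exploited throughout these studies.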
IEEE Transactions on Haptics, Vol. 13, No. 1, pp. 137-143, 2020
Rendering tactile effects on a touch screen via electrovibration has many potential applications. However, our knowledge of the tactile perception of a change in friction and of the underlying contact mechanics is very limited. In this article, we investigate the tactile perception and the contact mechanics for a step change in friction under electrovibration during relative sliding between a finger and the surface of a capacitive touch screen. First, we conduct magnitude estimation experiments to investigate the role of normal force and sliding velocity on the perceived tactile intensity for a step increase and a step decrease in friction, called rising friction (RF) and falling friction (FF). To investigate the contact mechanics involved in RF and FF, we then measure the frictional force, the apparent contact area, and the strains acting on the fingerpad during sliding at a constant velocity under three different normal loads, using a custom-made experimental setup. The results show that the participants perceived RF more strongly than FF, and both the normal force and the sliding velocity significantly influenced their perception. These results are supported by our mechanical measurements; the relative change in friction, the apparent contact area, and the strain in the sliding direction were all higher for RF than for FF, especially at low normal forces. Taken together, our results suggest that different contact mechanics take place during RF and FF due to the viscoelastic behavior of the fingerpad skin, and that those differences influence our tactile perception of a step change in friction.
Real-time simulation of deformable objects using finite element models is a challenge in medical simulation. We present two efficient methods for simulating the real-time behavior of a dynamically deformable 3D object modeled by finite element equations. The first method is based on modal analysis, which utilizes the most significant vibration modes of the object to compute the deformation field in real time for applied forces. The second method uses the spectral Lanczos decomposition to obtain the explicit solutions of the finite element equations that govern the dynamics of deformations. Both methods rely on modeling approximations but generate solutions that are computationally faster than those obtained through direct numerical integration techniques. In both methods, the errors introduced through the approximations were insignificant compared to the computational advantage gained in achieving real-time update rates.
1. Physically-based modeling of deformable objects for medical simulation
Simulation of soft tissue behavior in real time is a challenging problem. Once the contact between an instrument and tissue is determined, the problem centers on tool-tissue interactions. This involves realistic haptic feedback to the user and a realistic graphical display of tissue behavior depending on what surgical task (e.g. suturing, grasping, cutting, etc.) the user chooses to perform on the tissue. This is a nontrivial problem which calls for prudence in the application of mechanistic and computer graphics techniques in an endeavor to create a make-believe world that is realistic enough to mimic reality but efficient enough to be executable in real time. Soft-tissue mechanics is complicated not only due to nonlinearities and rate and time dependence in material behavior, but also because the tissues are layered and non-homogeneous.
The finite element method (FEM), though it demands more CPU time and memory, seems promising for integrating tissue characteristics into organ models. Although the mechanics community has developed sophisticated tissue models based on FEM, their integration with medical simulators has been difficult due to real-time requirements. Simulating the real-time deformable dynamics of a 3D object using FEM becomes increasingly difficult as the total number of nodes/degrees of freedom (dof) increases. With the addition of haptic displays, this is even more challenging, since a haptic loop typically requires a much higher update rate than a visual loop for stable force interactions. Although fast finite element models have been developed for medical applications (Bro-Nielsen and Cotin, 1996; Berkley et al., 2000), less attention has been paid to displaying time-dependent deformations of large models in real time. This paper introduces two numerically fast techniques for the real-time simulation of dynamically deformable (i.e. time-dependent deformations) 3D objects modeled by FEM: (a) modal analysis (Basdogan, 1999; Basdogan et al., 2000) and (b) spectral Lanczos decomposition.
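Modal analysis reduces the FEM system to a handful of dominant modes. The sketch below shows the idea in the static limit, where each mass-normalized mode phi_i contributes phi_i * (phi_i . F) / omega_i^2 to the displacement; this is an illustration of modal superposition, not the paper's implementation, and the modes here are assumed to be precomputed offline:

```python
def modal_static_response(modes, freqs, forces):
    """Truncated modal superposition for the static response:
    u = sum_i phi_i * (phi_i . F) / omega_i^2
    modes:  list of mass-normalized mode shape vectors phi_i
    freqs:  corresponding natural frequencies omega_i (rad/s)
    forces: nodal force vector F"""
    n = len(modes[0])
    u = [0.0] * n
    for phi, w in zip(modes, freqs):
        f_i = sum(p * F for p, F in zip(phi, forces))  # modal force
        q_i = f_i / (w * w)                            # modal coordinate
        for j in range(n):
            u[j] += phi[j] * q_i
    return u
```

The payoff is that the per-frame cost scales with the number of retained modes rather than with the full dof count, which is what makes haptic-rate updates feasible.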
Realistic simulation of tissue cutting and bleeding comprises important components of a surgical simulator, and both are addressed in this study. Surgeons use a number of instruments to perform incision and dissection of tissues during minimally invasive surgery. For example, a coagulating hook is used to tear and spread the tissue that surrounds organs, and scissors are used to dissect the cystic duct during laparoscopic cholecystectomy. During the execution of these procedures, bleeding may occur and blood flows over the tissue surfaces. We have developed computationally fast algorithms to display (1) tissue cutting and (2) bleeding in virtual environments, with applications to laparoscopic surgery. Cutting through soft tissue generates an infinitesimally thin slit until the sides of the surface are separated from each other. Simulation of an incision through the tissue surface is modeled in three steps: first, collisions between the instrument and the tissue surface are detected as the simulated cutting tool passes through; then, the vertices along the cutting path are duplicated; finally, a simple elastic tissue model is used to separate the duplicated vertices from each other to reveal the cut. Accurate simulation of bleeding is a challenging problem because of the complexities of the circulatory system and the physics of viscous fluid flow. Several fluid flow models are described in the literature, but most are computationally slow and do not specifically address blood flowing over soft tissues. We have reviewed the existing models and adapted them to our specific task. The key characteristics of our blood flow model are visually realistic display and real-time computational performance. To display bleeding in virtual environments, we developed a surface flow algorithm.
This method is based on a simplified form of the Navier-Stokes equations governing viscous fluid flow. Simplifying these partial differential equations yields a wave equation that can be solved efficiently, in real time, with finite difference techniques. The solution describes the flow of blood over the polyhedral surfaces representing the anatomical structures and is displayed as a continuous polyhedral surface drawn over the anatomy.
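The kind of explicit finite-difference update this reduction enables can be sketched as follows: treat the blood layer as a height field on a grid and step a damped wave equation forward in time. This is a generic illustration, not the paper's exact scheme; the constants `c` and `damping` are illustrative tuning values.

```python
import numpy as np

def wave_step(h, h_prev, c=0.3, damping=0.99):
    """One explicit finite-difference step of a damped wave equation
    h_tt = c^2 * laplacian(h) on a periodic grid with unit spacing and
    unit time step. The explicit scheme is only stable for small enough
    c (CFL-type limit); damping < 1 bleeds energy out of the velocity
    term so the flow settles rather than oscillating forever."""
    lap = (np.roll(h, 1, 0) + np.roll(h, -1, 0) +
           np.roll(h, 1, 1) + np.roll(h, -1, 1) - 4.0 * h)
    h_next = h + damping * (h - h_prev) + (c ** 2) * lap
    return h_next, h  # new field, and the current field as the next h_prev
```

Each step is a handful of array additions per grid cell, which is why a wave-equation reduction can hit real-time rates where a full Navier-Stokes solve cannot.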
We have developed a multi-modal virtual environment setup by fusing visual and haptic images through the use of a new autostereoscopic display and a force-feedback haptic device. Most earlier visualization systems that integrate stereo vision and touch have relied on polarized or shutter glasses for stereo viewing. In this paper, we discuss the development stages and components of our setup, which allows a user to touch, feel, and manipulate virtual objects through a haptic device while seeing them in stereo without any special eyewear. We also discuss the transformations involved in mapping the absolute coordinates of virtual objects into the visual and haptic workspaces and the synchronization of cursor movements across these workspaces. Future applications of this work will include a) multi-modal visualization of planetary data and b) planning of space mission operations in virtual environments.

Setup

Our setup is designed to create a multi-modal virtual environment that integrates vision and touch with minimum obstruction to the user. We believe that the next generation of user interfaces for virtual environments will be non-intrusive and more natural. With this aim, we have developed a setup (see Figure 1) that includes a new projection table for autostereoscopic visualization and a PHANToM haptic device (available from Sensable Technologies Inc.) for simulating touch interactions.

Figure 1. Our setup includes a projection table for stereo visualization of 3D objects without any special eyewear and a haptic device for force feedback.

Our autostereoscopic display system consists of two LCD projectors (one for each eye) and mirrors housed in a rectangular enclosure, topped with a holographic plate (see Figure 2). Each LCD projector reflects its image off a pair of mirrors, casting the image onto the hologram at the tabletop.
Our holographic plate is a typical parallax display: it allows each eye to see its corresponding image, but not the image meant for the other eye. To render two separate images, a graphics card with a dual-ported output was used (one output for each projector). The left and right images projected onto the holographic display were swapped to create a negative horizontal parallax, so that the displayed 3D image appears to float slightly above the holographic plate (see Figure 2).
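The workspace-mapping step mentioned above — keeping the haptic cursor aligned with its visual counterpart — usually reduces to composing calibrated homogeneous transforms. The sketch below is a simplified stand-in (uniform scale plus translation only; real calibrations also include rotation, and these function names are ours):

```python
import numpy as np

def make_transform(scale, offset):
    """4x4 homogeneous transform applying a uniform scale followed by a
    translation. A simplified model of the calibrated world-to-visual
    and haptic-to-world mappings (assumed form, rotation omitted)."""
    T = np.eye(4)
    T[:3, :3] *= scale
    T[:3, 3] = offset
    return T

def map_point(T, p):
    """Map a 3D point through a homogeneous transform."""
    return (T @ np.append(p, 1.0))[:3]

def haptic_to_visual(p_haptic, T_haptic_to_world, T_world_to_visual):
    """Synchronize the cursor: haptic device coords -> world coords ->
    visual workspace coords, by chaining the two mappings."""
    return map_point(T_world_to_visual, map_point(T_haptic_to_world, p_haptic))
```

Running the device position through this chain every haptic frame keeps the graphical cursor and the felt contact point at the same location in the shared scene.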
A virtual reality (VR) toolkit that integrates the human operator into a virtual environment by means of visual and haptic feedback has been developed to design and test manipulation strategies at the nano-scale. Currently, the toolkit can model the mechanistic interactions between an AFM tip and spherical particles on a substrate surface and generate optimum manipulation paths using a potential field approach. In addition, haptic fixtures were designed to guide the user along the calculated paths.
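A potential field planner of the kind mentioned here typically sums an attractive pull toward the goal with repulsive pushes away from nearby particles, then descends the gradient. The following is a generic textbook-style sketch (the gains `k_att`, `k_rep`, and the influence radius `rho0` are assumed values, not the toolkit's parameters):

```python
import numpy as np

def potential_grad(p, goal, obstacles, k_att=1.0, k_rep=0.5, rho0=1.0):
    """Gradient of the combined potential at point p: a quadratic
    attractive well at the goal plus the standard repulsive term
    0.5*k*(1/d - 1/rho0)^2 for each obstacle within range rho0."""
    grad = k_att * (p - goal)                      # attractive term
    for ob in obstacles:
        d = np.linalg.norm(p - ob)
        if 1e-9 < d < rho0:                        # repulsion only in range
            grad += k_rep * (1.0 / rho0 - 1.0 / d) * (p - ob) / d ** 3
    return grad

def plan_path(start, goal, obstacles, step=0.05, iters=500):
    """Gradient descent on the potential field from start toward goal."""
    p = np.asarray(start, float)
    goal = np.asarray(goal, float)
    path = [p]
    for _ in range(iters):
        p = p - step * potential_grad(p, goal, obstacles)
        path.append(p)
        if np.linalg.norm(p - goal) < 1e-3:
            break
    return np.array(path)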
New 3D video representations enable new modalities of interaction, such as haptic interaction, with 2D and 3D video for truly immersive media applications. Haptic interaction with video includes haptic structure and haptic motion for new immersive experiences. Haptic structure signals can be computed from 3D scene geometry or depth information. This paper introduces the concept of haptic motion, as well as new methods to compute haptic structure and motion signals for the 2D video-plus-depth representation. The resulting haptic signals can be rendered using a haptic cursor attached to a 2D or 3D video display. Experimental results and a demo system are available.
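One common way to turn a depth map into a haptic structure signal is to render a lateral force proportional to the local surface slope under the cursor, so ridges and edges in the depth image can be felt. The sketch below illustrates that gradient-based idea only; the gain `k` and the sign convention are assumptions, not the paper's method.

```python
import numpy as np

def haptic_structure_force(depth, x, y, k=1.0):
    """Lateral force from the local gradient of a depth map: the cursor
    at pixel (x, y) is pushed against the surface slope, proportional
    to how steeply depth changes there. np.gradient returns derivatives
    along axis 0 (rows, y) then axis 1 (columns, x)."""
    gy, gx = np.gradient(depth.astype(float))
    fx, fy = -k * gx[y, x], -k * gy[y, x]
    return np.array([fx, fy])
```

On a flat region the force is zero; at a depth discontinuity (an object boundary in video-plus-depth) the cursor meets a clear resistive force, which is what conveys structure.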
We investigate how collaborative guidance can be realized in multi-modal virtual environments for dynamic tasks involving motor control. Haptic guidance in our context is any form of force/tactile feedback that the computer generates to help a user execute a task faster, more accurately, and in a subjectively more pleasing fashion. In particular, we are interested in determining the guidance mechanisms that best facilitate task performance and arouse a natural sense of collaboration. We suggest that a haptic guidance system can be further improved if it is supplemented with a role exchange mechanism, which allows the computer to adjust the forces it applies to the user in response to his/her actions. Recent work on collaboration and role exchange has presented new perspectives on defining roles and interaction. However, existing approaches mainly focus on relatively basic environments whose state can be described with a few parameters. We designed and implemented a complex and highly dynamic multi-modal game for testing our interaction model. Since the state space of our application is complex, role exchange needs to be implemented carefully. We defined a novel negotiation process, which facilitates dynamic communication between the user and the computer and realizes the exchange of roles using a three-state finite state machine. Our preliminary results indicate that even though the negotiation and role exchange mechanism we adopted does not improve performance by every evaluation criterion, it introduces a more personal and human-like interaction model.
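A three-state negotiation machine of the kind described can be sketched as follows. The state names, the force-disagreement trigger, and the hold-count rule below are illustrative assumptions to show the mechanism's shape, not the paper's exact protocol.

```python
class RoleExchangeFSM:
    """Minimal three-state role-exchange machine: the leading party
    keeps control until sustained force disagreement opens a
    negotiation, which either hands over the lead or collapses back."""
    STATES = ("USER_LEADS", "NEGOTIATION", "COMPUTER_LEADS")

    def __init__(self, threshold=2.0, hold_steps=10):
        self.state = "USER_LEADS"
        self.threshold = threshold    # disagreement that opens negotiation
        self.hold_steps = hold_steps  # sustained steps needed to switch roles
        self._count = 0

    def step(self, user_force, computer_force):
        disagreement = abs(user_force - computer_force)
        if self.state == "USER_LEADS":
            if disagreement > self.threshold:
                self.state, self._count = "NEGOTIATION", 0
        elif self.state == "NEGOTIATION":
            if disagreement > self.threshold:
                self._count += 1
                if self._count >= self.hold_steps:
                    self.state = "COMPUTER_LEADS"
            else:
                self.state = "USER_LEADS"  # user reasserts control
        elif self.state == "COMPUTER_LEADS":
            if disagreement > self.threshold:
                self.state, self._count = "NEGOTIATION", 0
        return self.state
```

The intermediate NEGOTIATION state is what makes the exchange feel deliberate rather than abrupt: a single spike of disagreement is not enough to take or cede the lead.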
We show that vibrotactile feedback displayed through the steering wheel of a car can reduce the perceptual and cognitive load of the driver, leading to less distraction and fewer navigation errors. To demonstrate the concept, two vibration motors were mounted on the steering wheel of a driving simulator, and driving experiments were performed in virtual environments under two sensory conditions (auditory feedback alone, and auditory and vibrotactile feedback together). The results of our experiments with 12 subjects show that, when passenger auditory noise and distraction exist in the environment, navigation errors (making a wrong turn or taking a wrong exit) are reduced when vibrotactile feedback is displayed to the users in tandem with the GPS-based voice commands.
In the near future, humans and robots are expected to perform collaborative tasks involving physical interaction in a variety of environments such as homes, hospitals, and factories. One important research topic in physical Human-Robot Interaction (pHRI) is developing tacit and natural haptic communication between the partners. Although there are already several studies in the area of Human-Robot Interaction, studies investigating the physical interaction between the partners, and in particular their haptic communication, are limited, and the interaction in such systems is still artificial compared to natural human-human collaboration. While tasks involving physical interaction, such as table transportation, can be planned and executed naturally and intuitively by two humans, there are unfortunately no robots on the market that can collaborate with us and perform the same tasks. In this study, we propose a new controller for the robotic partner that is designed to a) detect the intentions of the human partner through the haptic channel using a fuzzy controller, b) adjust its contribution to the task via a variable impedance controller, and c) resolve conflicts during task execution by controlling the internal forces. The results of simulations performed in Simulink/MATLAB show that the proposed controller is superior to stand-alone standard and variable impedance controllers.
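The core of a variable impedance scheme like (b) is to modulate the robot's apparent damping with the estimated human intent: yield when the human clearly wants to move, stiffen to reject disturbances otherwise. The one-step admittance sketch below illustrates that idea only; the mass, damping range, and the intent scale in [0, 1] are assumed values, and the actual controller in the study also handles fuzzy intent detection and internal forces.

```python
def admittance_step(v, f_ext, intent, dt=0.001, m=5.0,
                    b_min=5.0, b_max=50.0):
    """One Euler step of the variable-admittance law
    m * dv/dt + b(intent) * v = f_ext.
    High intent lowers the damping b so the robot yields to the human;
    low intent raises b so stray forces barely move the shared object
    (all gains here are illustrative assumptions)."""
    b = b_max - intent * (b_max - b_min)   # intent = 1 -> most compliant
    dv = (f_ext - b * v) / m
    return v + dv * dt
```

Integrated every millisecond in the robot's control loop, this makes the robot's contribution to the task a continuous function of how strongly the human appears to be leading.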
Papers by C. Basdogan