I am using Python 2.7 with naoqi to make the robot grab an object with both hands (either from the bottom or from two sides).

As a first step, I want to move an effector to one side of a perceived target object, using the Cartesian control API to apply the movement.

I first show a target object to Nao in order to get its x,y,z position like so:

    tracker_service = session.service("ALTracker")
    xyz_pos = tracker_service.getTargetPosition(motion.FRAME_TORSO)
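For reference, FRAME_TORSO has x pointing forward, y to the left and z up (in metres), with its origin at the torso, so a reading like [0.8, 0.012, 0.9] would place the target 0.8 m in front of and 0.9 m above the torso. A small sanity check along those lines (plain Python; the reachability threshold is an illustrative guess, not a NAOqi constant):

```python
# FRAME_TORSO convention: x forward, y left, z up (metres), origin at the torso.
def is_plausible_reach(xyz, max_reach=0.35):
    """Rough sanity check: NAO's arms are short, so a target much more than
    ~0.35 m from the torso origin cannot be a graspable object held right in
    front of the robot. The 0.35 m threshold is only an illustrative guess."""
    x, y, z = xyz
    return (x ** 2 + y ** 2 + z ** 2) ** 0.5 <= max_reach

print(is_plausible_reach([0.09, 0.05, 0.08]))  # True  - close to the torso
print(is_plausible_reach([0.8, 0.012, 0.9]))   # False - over a metre away
```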

Then, in an attempt to move the effector to a point near the object, I do:

    effector = "RArm"
    frame = motion.FRAME_TORSO

    # Current transform of the effector, and its 3D position
    effector_offset = almath.Transform(self.motion.getTransform(effector, frame, False))
    effector_init_3d_position = almath.position3DFromTransform(effector_offset)

    # Displacement from the effector to the target
    target_3d_position = almath.Position3D(target_position)
    move_3d = target_3d_position - effector_init_3d_position
    moveTransform = almath.Transform.fromPosition(move_3d.x, move_3d.y, move_3d.z)
    target_transformer_list = list(moveTransform.toVector())

    times = [2.0]
    axis_mask_list = motion.AXIS_MASK_VEL
    self.motion.transformInterpolations(effector, frame, target_transformer_list, axis_mask_list, times)
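One thing I noticed while reading the ALMotion docs: `transformInterpolations` appears to treat each transform in the path as an absolute pose in the given frame, not as a relative displacement, so a translation-only transform built from `move_3d` would be interpreted as a pose near the frame origin. If that is right, the target transform should keep the effector's current rotation and carry the target point as its translation. A minimal sketch of that composition in plain Python (no naoqi needed; row-major 4x4 layout and the helper name are my own, for illustration):

```python
# Sketch: build an absolute 4x4 target transform by keeping the effector's
# current rotation block and replacing only the translation column with the
# desired target point, using row-major nested lists.
def absolute_target_transform(current, target_xyz):
    """current: 4x4 row-major transform of the effector in FRAME_TORSO.
    target_xyz: desired effector position (x, y, z) in the same frame.
    Returns a new 4x4 transform with the same orientation but with the
    translation set to target_xyz; the input is left untouched."""
    result = [row[:] for row in current]  # deep-ish copy of the 4x4 matrix
    result[0][3], result[1][3], result[2][3] = target_xyz
    return result

# Identity stands in for the effector's current transform in this sketch.
identity = [[1.0, 0.0, 0.0, 0.0],
            [0.0, 1.0, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]
target = absolute_target_transform(identity, (0.09, 0.05, 0.08))
print(target[0][3], target[1][3], target[2][3])  # 0.09 0.05 0.08
```

Flattening `target` row by row would give the 16-element list that `getTransform` returns and that the interpolation call consumes.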

If move_3d has small values such as [0.09, 0.05, 0.08], the arm moves as expected; with larger values the motion fails or behaves erratically.

Now the problem is that xyz_pos is of the magnitude of [0.8, 0.012, 0.9], even though I show the object in front of the robot's torso (slightly below its head). These values don't make sense to me. My question is: why is the target object's position not coherent, and how do I apply the movement correctly (assuming my arm-moving code is correct)?

Thanks.
