A growing number of medical datasets now contain both a spatial and a temporal dimension. Trajectories, whether of tools or of body features, are thus becoming increasingly important for their analysis. In this paper, we are interested in recovering the spatial and temporal differences between trajectories coming from different datasets. In particular, we address the case of surgical gestures, where trajectories exhibit both spatial transformations and differences in execution speed. We first define the spatio-temporal registration problem between multiple trajectories. We then propose an optimization method to jointly recover both the rigid spatial motions and the non-linear time warpings. The optimization also generates a generic trajectory template in which spatial and temporal differences have been factored out. This approach can potentially be used to register and compare gestures side by side for training sessions, to build gesture trajectory models for automation by a robot, or to register the trajectories of natural or artificial markers that follow similar motions. We demonstrate its usefulness with synthetic and real experiments. In particular, we register and analyze complex surgical gestures performed by tele-manipulation with the da Vinci robot.