With the advent of wearable eye-tracking glasses and devices such as Google Glass, monitoring of human visual attention will soon become ubiquitous. The presented video describes the precise estimation of human gaze fixations – captured by eye-tracking glasses (SMI) – with respect to the environment, without the need for artificial landmarks in the field of view, and capable of mapping attention onto 3D information. The method enables full 3D recovery of the human view frustum and the gaze pointer within a previously acquired 3D model of the environment in real time. The key contribution is that our methodology maps fixations directly into an automatically computed 3D model. This opens new opportunities for studies of human attention during interaction with the environment, bringing new potential to automated processing for human-factors technologies.
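The core geometric step implied above – turning a 2D gaze direction into a fixation point on the 3D model – can be sketched as a ray cast: localize the glasses' scene camera in the model to get a 6-DoF pose, rotate the gaze direction into world coordinates, and intersect the resulting ray with the model's mesh. The sketch below is an illustrative assumption, not the authors' implementation; the function name, pose convention, and mesh representation are hypothetical.

```python
import numpy as np

def gaze_point_in_model(R_wc, t_wc, gaze_dir_cam, triangles):
    """Illustrative sketch: map a gaze ray into a world-frame 3D model.

    R_wc, t_wc   : camera-to-world rotation (3x3) and translation (3,),
                   assumed to come from localizing the scene camera in the model.
    gaze_dir_cam : unit gaze direction in the scene-camera frame.
    triangles    : (N, 3, 3) array of mesh triangles in world coordinates.
    Returns the nearest intersection point (the 3D fixation), or None.
    """
    origin = t_wc
    d = R_wc @ gaze_dir_cam            # gaze ray expressed in world coordinates
    best_t, best_pt = np.inf, None
    for v0, v1, v2 in triangles:       # Moller-Trumbore ray/triangle test
        e1, e2 = v1 - v0, v2 - v0
        p = np.cross(d, e2)
        det = e1 @ p
        if abs(det) < 1e-9:            # ray parallel to triangle plane
            continue
        inv = 1.0 / det
        s = origin - v0
        u = (s @ p) * inv
        if u < 0.0 or u > 1.0:
            continue
        q = np.cross(s, e1)
        v = (d @ q) * inv
        if v < 0.0 or u + v > 1.0:
            continue
        t = (e2 @ q) * inv
        if 1e-6 < t < best_t:          # keep the closest hit in front of the eye
            best_t, best_pt = t, origin + t * d
    return best_pt

# Example: camera at the origin looking along +z at a triangle in the plane z = 2.
tris = np.array([[[-1., -1., 2.], [3., -1., 2.], [-1., 3., 2.]]])
pt = gaze_point_in_model(np.eye(3), np.zeros(3), np.array([0., 0., 1.]), tris)
# pt is the 3D fixation (0, 0, 2)
```

A real-time system would replace the per-triangle loop with an accelerated structure (e.g. a BVH or octree over the model), but the geometry is the same.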