Abstract: From high-fidelity field exercises to disaster-response deployments, search and rescue robots are increasingly being integrated into rescue operations. Previous research has proposed that for such new technology to succeed in an operation, the organizational architecture must support the coordination of shared perspectives between the human team members and the robotic platforms. For this, the robot platforms need to be effective team players in the field of practice. Building on this conceptual model, this paper introduces a novel software interface that uses virtual position and orientation indicators to alleviate the perceptual ambiguities and navigation problems experienced by robot handlers and problem holders. By actively orchestrating and sharing these indicators between handler and operator displays, the interface caters to user expertise and to the natural competency of the human perceptual system. These indicators provide a basic tool for aiding robot navigation and way-finding, which are fundamental to effective team coordination and communication in urban search and rescue missions.