Can a patient who has lost the sense of sight learn to recognize a tactile representation of a work of visual art? If so, do any similarities exist between the touch tracks such a patient's fingers follow when tactilely "viewing" such a representation, and the eye tracks a sighted person follows when viewing the corresponding visual image? Cognitive neuroscientists exploring these questions require a means of accurately computing and recording, in real time, the time-varying positions of a patient's fingers during tactile "viewing." This thesis project comprised the design, coding, and testing of a computer-based system fulfilling that requirement.
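
The core requirement named above, sampling time-varying finger positions and logging them for later comparison with eye-track data, can be illustrated with a minimal sketch. The abstract does not describe the system's actual sensing hardware or software, so the `FingerTracker` class and the `record_touch_track` function below are hypothetical stand-ins: the tracker here returns dummy coordinates, and a real implementation would read from the project's sensing device.

```python
# Minimal sketch: poll a finger-position source at a fixed rate and log
# timestamped (x, y) coordinates to a CSV file. FingerTracker is a
# hypothetical placeholder for the thesis system's actual sensing hardware.

import csv
import random
import time


class FingerTracker:
    """Hypothetical position source; a real system would read sensor hardware."""

    def read_position(self) -> tuple[float, float]:
        # Placeholder: random coordinates in a unit workspace.
        return random.random(), random.random()


def record_touch_track(tracker: FingerTracker, path: str,
                       duration_s: float = 5.0, rate_hz: float = 60.0) -> None:
    """Poll the tracker at rate_hz for duration_s, writing a timestamped log."""
    period = 1.0 / rate_hz
    start = time.monotonic()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t_seconds", "x", "y"])
        while (now := time.monotonic()) - start < duration_s:
            x, y = tracker.read_position()
            writer.writerow([f"{now - start:.4f}", f"{x:.4f}", f"{y:.4f}"])
            time.sleep(period)


if __name__ == "__main__":
    record_touch_track(FingerTracker(), "touch_track.csv", duration_s=1.0)
```

The timestamped log format is one plausible design choice: because each sample carries its own elapsed time, the recorded touch track can be replayed or aligned against eye-tracking data sampled at a different rate.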