Fast, accurate, and satisfying text entry remains a significant challenge on touch screens. The lack of tactile feedback from physical keys and the loss of the distinction between touching and pressing are two of many difficulties. These challenges grow for mobile text entry, where small screens and walking-induced situational impairments further compromise accuracy. In this talk, I will present a study of “touch-typing on flat glass” that characterizes finger-strike patterns for touch screen keyboards. I will then describe an adaptive keyboard built for Microsoft Surface that morphs its key layout to remain positioned beneath users’ fingers, and show how stroke gestures for non-alphanumeric input can be incorporated into this keyboard. For mobile text entry, I will describe WalkType, a keyboard made more accurate while walking by incorporating accelerometer data and inferences about users’ walking behavior. Finally, I will describe Perkinput, an eyes-free text entry method based on the Perkins Brailler that uses Braille-like input patterns. Taken together, these projects highlight the potential for effective touch screen text input and point to future possibilities where exciting work remains.