In this study, we present advances in the development of proactive control for online individual user adaptation in a welfare robot guidance scenario. The approach integrates three main modules: navigation control, visual human detection, and temporal error correlation-based neural learning. The proposed control approach enables a mobile robot to navigate autonomously in relevant indoor environments while, at the same time, predicting human walking speed from visual information without prior knowledge of the user's personality or preferences (i.e., preferred walking speed).
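To illustrate the kind of temporal error correlation-based neural learning referred to above, the following minimal Python sketch shows an ICO-style correlation rule in which earlier-arriving visual predictive inputs are weighted according to their correlation with the temporal change of a later reflexive tracking error. The class name, signal definitions, and parameter values are illustrative assumptions, not the implementation used in this work.

```python
import numpy as np

# Minimal sketch of a temporal error correlation-based (ICO-style) learning rule
# for online adaptation to an individual user's walking speed.
# All signal names, the error definition, and parameter values are assumptions.

class TemporalErrorCorrelationLearner:
    def __init__(self, n_predictive, learning_rate=0.01):
        self.weights = np.zeros(n_predictive)  # weights of predictive (visual) inputs
        self.learning_rate = learning_rate
        self.prev_error = 0.0                  # previous reflexive error signal

    def step(self, predictive_inputs, error):
        """predictive_inputs: earlier visual cues (e.g., robot-human distance
        and its rate of change); error: late reflexive tracking error (e.g.,
        deviation from a desired robot-human distance)."""
        x = np.asarray(predictive_inputs, dtype=float)
        d_error = error - self.prev_error      # temporal derivative of the error
        # Correlation-based update: a weight grows when its predictive input
        # correlates with a subsequent change of the error signal.
        self.weights += self.learning_rate * x * d_error
        self.prev_error = error
        # Proactive output: learned prediction plus the reflexive correction.
        return float(np.dot(self.weights, x) + error)
```

In such a scheme, the learner's output could modulate the robot's forward speed so that, after adaptation, the learned visual term acts proactively before a large tracking error builds up, approximating the guided person's preferred walking speed online.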