Intrinsic Motivation
Summary
Intrinsic motivation in AI research focuses on methods that let agents autonomously explore and learn about their environments without relying on external rewards. Two key approaches in this area are empowerment maximization and skill discovery. Empowerment, defined as the maximum mutual information between an agent’s actions and the resulting states of the environment, quantifies the agent’s influence over its surroundings and serves as an intrinsic reward signal. Recent work has developed efficient methods to estimate empowerment under unknown dynamics, working from visual observations and latent-space representations. Meanwhile, the Dynamics-Aware Discovery of Skills (DADS) algorithm combines model-based and model-free learning to simultaneously discover predictable behaviors and models of their dynamics. This enables zero-shot planning in the learned latent space and outperforms standard reinforcement learning methods on a range of tasks, including those with sparse rewards. Both approaches draw on information-theoretic principles and unsupervised learning to build more autonomous and adaptable agents.
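In information-theoretic terms, the empowerment of a state s is commonly written as the channel capacity between the agent’s action (or action sequence) and the resulting next state; the notation below is a standard textbook formulation rather than one taken from a specific paper summarized here:

$$
\mathcal{E}(s) \;=\; \max_{\omega(a \mid s)} I(A;\, S' \mid S = s) \;=\; \max_{\omega(a \mid s)} \big[ H(S' \mid s) - H(S' \mid A, s) \big],
$$

where $\omega(a \mid s)$ is the action distribution being optimized and $S'$ is the state that results. Estimating this quantity when the dynamics are unknown and only visual observations are available is what the latent-space methods mentioned above address.

DADS turns a related mutual-information objective, $I(s'; z \mid s)$ for skill variable z, into a per-transition intrinsic reward by contrasting a learned skill-conditioned dynamics model against its marginal over skills. The sketch below is a minimal illustration of that reward under assumed interfaces: `log_q` stands in for a learned model giving $\log q_\phi(s' \mid s, z)$, and `prior_samples` for skills drawn from the skill prior; neither name comes from the DADS codebase.

```python
import numpy as np

def dads_intrinsic_reward(log_q, s, s_next, z, prior_samples):
    """Sketch of a DADS-style intrinsic reward:
        r(s, a, s') ~= log q(s'|s, z) - log[(1/L) * sum_i q(s'|s, z_i)],
    a sample-based approximation of the mutual information I(s'; z | s).
    """
    log_numer = log_q(s, s_next, z)  # log-density under the executed skill z
    log_terms = np.array([log_q(s, s_next, z_i) for z_i in prior_samples])
    # log of the Monte Carlo marginal (1/L) * sum_i q(s'|s, z_i), computed stably
    log_marginal = np.logaddexp.reduce(log_terms) - np.log(len(prior_samples))
    return log_numer - log_marginal

# Toy usage with a hypothetical Gaussian skill-dynamics model (illustrative only)
log_q = lambda s, s_next, z: -0.5 * np.sum((s_next - (s + z)) ** 2)
s, s_next = np.zeros(2), np.ones(2)
r = dads_intrinsic_reward(log_q, s, s_next, z=np.ones(2),
                          prior_samples=[np.random.randn(2) for _ in range(16)])
```

Transitions that the skill-conditioned model predicts well, but that other skills do not, receive high reward; this is what makes the discovered behaviors predictable enough to support planning in the learned latent space.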