Ensuring the safety of autonomous systems requires certification methods that can keep pace with systems that learn, update, and operate in changing environments. Researchers at The University of Texas at Austin and collaborating institutions are developing frameworks that move beyond traditional static, one-time certification. The multi-university effort integrates expertise in controls, formal methods, machine learning, human factors, robotics, and systems engineering to develop scalable methods for assuring the safety of real-world autonomous systems.