Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Hierarchical Episodic Control

Version 1 : Received: 30 August 2023 / Approved: 31 August 2023 / Online: 31 August 2023 (09:39:49 CEST)

A peer-reviewed article of this Preprint also exists.

Zhou, R.; Zhang, Z.; Wang, Y. Hierarchical Episodic Control. Appl. Sci. 2023, 13, 11544.

Abstract

Deep reinforcement learning is one of the research hotspots in artificial intelligence and has been successfully applied in many research areas; however, low training efficiency and high sample demand limit its application. To address these problems, this paper proposes a hierarchical episodic control model that extends episodic memory to the domain of hierarchical reinforcement learning. The model is theoretically justified and employs a hierarchical implicit memory planning approach for counterfactual trajectory value estimation: starting from the final step and recursively moving back along the trajectory, a hidden plan is formed within the episodic memory. Experience is aggregated both along trajectories and across trajectories, and the model is updated using multi-headed backpropagation similar to bootstrapped neural networks. The model thus extends the parameterized episodic memory framework to hierarchical reinforcement learning, and its convergence and effectiveness are analyzed theoretically. Experiments conducted in Four Room, MuJoCo, and UE4-based active tracking environments show that the hierarchical episodic control model effectively improves training efficiency, with notable gains in both low-dimensional and high-dimensional environments, even under sparse rewards.
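The backward pass described in the abstract — starting from the final step of a trajectory and recursively propagating value estimates back through episodic memory — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `episodic_backup` is hypothetical, and the specific backup rule (taking the maximum of the stored return and the learned value estimate at each step) is the standard episodic-memory backup from the broader literature, assumed here for concreteness.

```python
def episodic_backup(rewards, q_values, gamma=0.99):
    """Hypothetical sketch of a backward episodic-memory value backup.

    rewards[t]  : reward received at step t of one trajectory
    q_values[t] : current learned value estimate for step t
    Returns one return target per step, computed by moving from the
    final step back to the first (the "implicit planning" pass).
    """
    T = len(rewards)
    targets = [0.0] * T
    targets[T - 1] = rewards[T - 1]  # final step: no bootstrap available
    for t in range(T - 2, -1, -1):
        # Each target bootstraps from the better of the episodic return
        # already propagated from the future and the learned estimate,
        # which lets high-return experience flow back in one sweep.
        targets[t] = rewards[t] + gamma * max(targets[t + 1], q_values[t + 1])
    return targets
```

In a hierarchical setting, a pass like this would run at each level of the hierarchy (e.g. over subgoal transitions for the high-level policy and over primitive steps for the low-level policy), with the resulting targets used to update the corresponding value heads.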

Keywords

episodic memory; deep reinforcement learning; hierarchical reinforcement learning

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
