POMDP Planning for High Level UAV Decisions: Search vs. Strike

The Partially Observable Markov Decision Process (POMDP) model is explored for high-level decision making for Unmanned Air Vehicles (UAVs). The type of UAV modeled is a flying munition with a limited fuel supply. The UAV is destroyed when it strikes a target. When a UAV detects a target, a decision has to be made whether to continue searching for a better target or to strike the current target immediately. Many factors influence this decision, including target value, target density, threat levels, and fuel level. The POMDP is a suitable model for this battlefield situation because of uncertainty arising from the stochastic nature of the problem and the imperfect sensors of the UAV. Two POMDP models are presented in this paper. One uses the planning horizon to model the fuel level, while the other represents the fuel level explicitly in the states.
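The search-vs-strike trade-off the abstract describes can be sketched as a toy dynamic program. Everything below (the detection probability, target values, and sensor error rates) is an assumed illustration, not the paper's actual model; it only shows the two ingredients the abstract names: fuel carried explicitly in the state, and an imperfect sensor handled by a Bayes belief update over the target's value class.

```python
import functools

# Illustrative sketch only -- a toy search-vs-strike decision, not the
# paper's POMDP. Fuel is part of the state (as in the paper's second
# model), and sensor noise is handled by a Bayes update on target value.

P_DETECT = 0.4     # assumed chance a search step turns up a new target
VALUE = {"high": 5.0, "low": 1.0}   # assumed target values
P_HIGH = 0.3       # assumed prior that a detected target is high-value
SENSOR_TPR = 0.8   # P(reading "high" | target is high)
SENSOR_FPR = 0.2   # P(reading "high" | target is low)

def posterior_high(reads_high: bool) -> float:
    """Belief that the detected target is high-value, after one reading."""
    lh = SENSOR_TPR if reads_high else 1.0 - SENSOR_TPR
    ll = SENSOR_FPR if reads_high else 1.0 - SENSOR_FPR
    return lh * P_HIGH / (lh * P_HIGH + ll * (1.0 - P_HIGH))

def strike_ev(reads_high: bool) -> float:
    """Expected value of striking the current target, given the reading."""
    p = posterior_high(reads_high)
    return p * VALUE["high"] + (1.0 - p) * VALUE["low"]

def p_reading(reads_high: bool) -> float:
    """Marginal probability of the sensor reading on a random target."""
    p = SENSOR_TPR * P_HIGH + SENSOR_FPR * (1.0 - P_HIGH)
    return p if reads_high else 1.0 - p

@functools.lru_cache(maxsize=None)
def search_value(fuel: int) -> float:
    """Expected value of searching on, with `fuel` search steps left."""
    if fuel == 0:
        return 0.0  # out of fuel: the munition is spent with no strike
    keep_going = search_value(fuel - 1)
    # If a target appears, take the better of striking it or searching on.
    found = sum(p_reading(o) * max(strike_ev(o), keep_going)
                for o in (True, False))
    return P_DETECT * found + (1.0 - P_DETECT) * keep_going

# Decision rule: strike iff the strike's expected value beats further search.
fuel_left = 5
for reading in (True, False):
    act = "strike" if strike_ev(reading) >= search_value(fuel_left) else "search"
    print(f"sensor reads high={reading}: {act}")
```

With these assumed numbers the rule strikes on a "high" reading and keeps searching on a "low" one; as fuel runs down, `search_value` shrinks and the UAV becomes willing to strike lower-value targets, which is the qualitative behavior the abstract argues for.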
Added 31 Oct 2010
Updated 31 Oct 2010
Type Conference
Year 2003
Where CAINE
Authors Doug Schesvold, Jingpeng Tang, Benzir Md Ahmed, Karl Altenburg, Kendall E. Nygard