EconBiz - Find Economic Literature
Search: subject:"expected discounted total rewards"
Narrow search
Year of publication
  All
Subject
  All; Markov decision processes (2); decomposing the state space (2); eliminating actions (2); expected discounted total rewards (2); optimality equation (2)
Type of publication
  All; Article (2)
Language
  All; Undetermined (2)
Author
  All; Hu, Qiying (2); Xu, Chen (2)
Published in...
  All; Computational Statistics (1); Mathematical Methods of Operations Research (1)
Source
  All; RePEc (2)
Showing 1 - 2 of 2
The finiteness of the reward function and the optimal value function in Markov decision processes
Hu, Qiying; Xu, Chen - In: Mathematical Methods of Operations Research 49 (1999) 2, pp. 255-266
This paper studies discrete-time Markov decision processes (MDPs) under the expected discounted total reward criterion, where the state space is countable, the action space is measurable, the reward function is extended real-valued, and the discount rate may be any real number. Two conditions (GC) and (C)...
Persistent link: https://www.econbiz.de/10010999967
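For context, the expected discounted total reward criterion named in the abstract is conventionally written as follows; the notation (policy π, discount rate β, reward r) is assumed here for illustration and is not taken from the paper itself:

```latex
% Expected discounted total reward from initial state s under policy \pi.
% The paper allows the discount rate \beta to be any real number, so this
% sum need not converge, which is why finiteness conditions are studied:
V_\beta^{\pi}(s) \;=\; \mathbb{E}_s^{\pi}\!\left[\sum_{n=0}^{\infty} \beta^{\,n}\, r(s_n, a_n)\right]
```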
The finiteness of the reward function and the optimal value function in Markov decision processes
Hu, Qiying; Xu, Chen - In: Computational Statistics 49 (1999) 2, pp. 255-266
This paper studies discrete-time Markov decision processes (MDPs) under the expected discounted total reward criterion, where the state space is countable, the action space is measurable, the reward function is extended real-valued, and the discount rate may be any real number. Two conditions (GC) and (C)...
Persistent link: https://www.econbiz.de/10010847961
A service of the ZBW