Using partially observed Markov processes to select optimal termination time of TV shows
This paper presents a method for optimal control of a running television show. The problem is formulated as a partially observed Markov decision process (POMDP). A show can be in a "good" state, in which case it should be continued, or in a "bad" state, in which case it should be changed. The ratings of a show are modeled as a stochastic process that depends on the show's state. An optimal rule for the continue/change decision, which maximizes the expected present value of profits from selling advertising time, is expressed in terms of the prior probability of the show being in the good state. The optimal rule depends on the size of the investment in changing a show, the difference in revenues between a "good" and a "bad" show, and the number of time periods remaining until the end of the planning horizon. The application of the method is illustrated with both simulated ratings and real data.
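The sketch below is not the authors' model; it only illustrates, under assumed parameters, the general mechanics the abstract describes: a two-state hidden Markov chain ("good"/"bad"), ratings as noisy observations of the state (a Gaussian rating model is an assumption made here for concreteness), Bayesian updating of the probability of being in the good state, and finite-horizon backward induction that yields a belief threshold for the continue/change decision. All function names and numerical values are illustrative.

```python
import numpy as np

# --- illustrative parameters (assumptions, not taken from the paper) ---
p_stay_good = 0.95                 # P(good -> good) for the hidden Markov chain
r_good, r_bad = 1.0, 0.0           # per-period advertising revenue by state
switch_cost = 1.0                  # investment required to change the show
beta = 0.95                        # per-period discount factor
mu_good, mu_bad, sigma = 10.0, 6.0, 2.0  # assumed rating model: Normal(mu_state, sigma)


def belief_update(b_good, rating):
    """Bayes update of P(state = good) after observing one period's rating."""
    prior = b_good * p_stay_good   # "bad" is treated as absorbing in this sketch
    lik_good = np.exp(-0.5 * ((rating - mu_good) / sigma) ** 2)
    lik_bad = np.exp(-0.5 * ((rating - mu_bad) / sigma) ** 2)
    num = prior * lik_good
    den = num + (1.0 - prior) * lik_bad
    return num / den if den > 0 else prior


def continue_or_change_thresholds(horizon, grid_size=501):
    """Backward induction over a discretized belief grid.  For each number of
    remaining periods, return the smallest belief in the good state at which
    continuing the current show is optimal."""
    b = np.linspace(0.0, 1.0, grid_size)
    V = np.zeros(grid_size)        # terminal value: no periods remaining
    thresholds = []
    for _ in range(horizon):
        expected_revenue = b * r_good + (1.0 - b) * r_bad
        b_next = b * p_stay_good   # predicted belief before the next rating
        # Continue: collect expected revenue, carry the predicted belief forward
        # (averaging over next-period ratings is omitted to keep the sketch short).
        V_cont = expected_revenue + beta * np.interp(b_next, b, V)
        # Change: pay the switching cost and restart in the good state
        # (a simplifying assumption for this sketch).
        V_change = -switch_cost + r_good + beta * np.interp(p_stay_good, b, V)
        V = np.maximum(V_cont, V_change)
        keep = np.flatnonzero(V_cont >= V_change)
        thresholds.append(b[keep[0]] if keep.size else 1.0)
    return thresholds
```

Used together, `belief_update` tracks the probability of the good state as ratings arrive, and `thresholds[k]` (from `continue_or_change_thresholds`) gives the belief cutoff when k+1 periods remain: the show is continued as long as the updated belief stays at or above the cutoff, and changed otherwise.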
Year of publication: 2008
Authors: Givon, Moshe; Grosfeld-Nir, Abraham
Published in: Omega. - Elsevier, ISSN 0305-0483. - Vol. 36 (2008), No. 3, pp. 477-485
Publisher: Elsevier
Keywords: Dynamic programming; Markov chain; POMDP; TV shows; Planning and control; Simulation
Similar items by person
- Using partially observed Markov processes to select optimal termination time of TV shows (Givon, Moshe, 2008)
- Taste tests : changing the rules to improve the game (Givon, Moshe, 1989)