Markov Decision Processes with Their Applications (Hardcover)

Qiying Hu, Wuyi Yue

Series: Advances in Mechanics and Mathematics, 14

Loot Price R3,210 Discovery Miles 32 100 | Repayment Terms: R301 pm x 12*

Expected to ship within 10 - 15 working days

Markov decision processes (MDPs), also called stochastic dynamic programming, were first studied in the 1960s. MDPs can be used to model and solve multiperiod dynamic decision-making problems under uncertainty. There are three basic branches of MDPs: discrete-time MDPs, continuous-time MDPs, and semi-Markov decision processes. Starting from these three branches, many generalized MDP models have been applied to practical problems, including partially observable MDPs, adaptive MDPs, MDPs in stochastic environments, and MDPs with multiple objectives, constraints, or imprecise parameters. MDPs have been applied in many areas, such as communications, signal processing, artificial intelligence, stochastic scheduling, manufacturing systems, discrete event systems, management, and economics.

This book examines MDPs and their applications in the optimal control of discrete event systems (DESs), optimal replacement, and optimal allocation in sequential online auctions. It presents three main topics: a new methodology for MDPs with the discounted total reward criterion; the transformation of continuous-time MDPs and semi-Markov decision processes into a discrete-time MDP model, thereby simplifying the application of MDPs; and the application of MDPs in stochastic environments, which greatly extends the range of problems to which MDPs can be applied. Each topic is then used to study optimal control problems and other applications.
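
To make the discounted total reward criterion mentioned above concrete, here is a minimal value iteration sketch for a toy two-state machine-replacement MDP. The example, its transition probabilities, rewards, and discount factor are hypothetical illustrations and are not taken from the book; the sketch only shows the standard Bellman recursion V(s) = max_a [ r(s,a) + gamma * sum_s' P(s'|s,a) V(s') ] that underlies discrete-time MDPs.

```python
# Minimal value-iteration sketch for a toy discrete-time MDP under the
# discounted total reward criterion. All model data below are assumed values.
import numpy as np

# States: 0 = machine in good condition, 1 = machine worn.
# Actions: 0 = keep running, 1 = replace.
# P[a, s, s'] = probability of moving to state s' when action a is taken in state s.
P = np.array([
    [[0.8, 0.2],    # keep, from good: may stay good or wear out
     [0.0, 1.0]],   # keep, from worn: stays worn
    [[1.0, 0.0],    # replace, from good: machine is good again
     [1.0, 0.0]],   # replace, from worn: machine is good again
])
# R[a, s] = one-step reward for taking action a in state s (assumed values).
R = np.array([
    [5.0, 1.0],     # keep: a good machine earns more than a worn one
    [2.0, 2.0],     # replace: net of an assumed replacement cost
])

gamma = 0.95            # discount factor for the total reward criterion
V = np.zeros(2)         # value function, one entry per state

for _ in range(10_000):
    # Bellman optimality update: Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] * V[s']
    Q = R + gamma * (P @ V)
    V_new = Q.max(axis=0)               # best achievable value in each state
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=0)               # greedy action per state: 0 = keep, 1 = replace
print("Optimal values :", V)
print("Optimal policy :", policy)
```

Running the sketch prints the converged state values and the greedy policy (keep or replace in each state); the same recursion is the starting point for the discounted-reward and discrete-time models the book builds on.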

General

Imprint: Springer-Verlag New York
Country of origin: United States
Series: Advances in Mechanics and Mathematics, 14
Release date: November 2007
First published: 2008
Authors: Qiying Hu • Wuyi Yue
Dimensions: 235 x 155 x 23mm (L x W x T)
Format: Hardcover
Pages: 297
ISBN-13: 978-0-387-36950-1
Categories: Books > Science & Mathematics > Mathematics > Applied mathematics > Stochastics
LSN: 0-387-36950-3
Barcode: 9780387369501

You might also like...

Best Books gegradeerde leesreeks: Vlak 1…
Best Books Paperback R90 R85 Discovery Miles 850
bundle available
Stochastic Processes and Their…
Christo Ananth, N. Anbazhagan, … Hardcover R7,630 Discovery Miles 76 300
Special Functions Of Fractional…
Trifce Sandev, Alexander Iomin Hardcover R2,709 Discovery Miles 27 090
Stochastic Komatu-loewner Evolutions
Zhen-Qing Chen, Masatoshi Fukushima, … Hardcover R2,698 Discovery Miles 26 980
Advancements in Bayesian Methods and…
Alastair G Young, Arni S.R. Srinivasa Rao, … Hardcover R7,475 Discovery Miles 74 750
Hidden Link Prediction in Stochastic…
Babita Pandey, Aditya Khamparia Hardcover R5,523 Discovery Miles 55 230
Stochastic Partial Differential…
Ciprian A. Tudor Hardcover R2,167 Discovery Miles 21 670
Geometry and Statistics, Volume 46
Frank Nielsen, Arni S.R. Srinivasa Rao, … Hardcover R7,454 Discovery Miles 74 540
Stochastic Models of Financial…
Vigirdas Mackevicius Hardcover R3,645 Discovery Miles 36 450
Optimal Input Signals for Parameter…
Ewaryst Rafajlowicz Hardcover R3,881 Discovery Miles 38 810
Informal Introduction To Stochastic…
Ovidiu Calin Hardcover R2,599 Discovery Miles 25 990
Informal Introduction To Stochastic…
Ovidiu Calin Paperback R1,351 Discovery Miles 13 510
