One of the major concerns of theoretical computer science is the classification of problems in terms of how hard they are. The natural measure of difficulty of a function is the amount of time needed to compute it (as a function of the length of the input). Other resources, such as space, have also been considered. In recursion theory, by contrast, a function is considered to be easy to compute if there exists some algorithm that computes it. We wish to classify functions that are hard, i.e., not computable, in a quantitative way. We cannot use time or space, since the functions are not even computable. We cannot use Turing degree, since this notion is not quantitative. Hence we need a new notion of complexity, much like time or space, that is quantitative and yet in some way captures the level of difficulty (such as the Turing degree) of a function.
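As a concrete illustration (not taken from the blurb itself), the standard example of a function that is hard in this sense is the halting function, where e ranges over program codes:

$$ H(e) = \begin{cases} 1 & \text{if program } e \text{ halts on input } e, \\ 0 & \text{otherwise.} \end{cases} $$

No algorithm computes H, so time and space bounds simply do not apply to it; this is exactly the gap that a quantitative notion of complexity for non-computable functions is meant to fill.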
1 Introduction
2 Continuous-Time Quadratic Guaranteed Cost Filtering
3 Discrete-Time Quadratic Guaranteed Cost Filtering
4 Continuous-Time Set-Valued State Estimation and Model Validation
5 Discrete-Time Set-Valued State Estimation
6 Robust State Estimation with Discrete and Continuous Measurements
7 Set-Valued State Estimation with Structured Uncertainty
8 Robust H∞ Filtering with Structured Uncertainty
9 Robust Fixed Order H∞ Filtering
10 Set-Valued State Estimation for Nonlinear Uncertain Systems
11 Robust Filtering Applied to Induction Motor Control
References
The publication of this book opens new vistas for both the author and the reader in the field of science fiction. It takes established truth, gives it a "what if" twist, and presents provocative, unexpected conclusions. In the story "DEEPWATER" we come face to face with an ancient mystery that seems to defy logic. "The Waves" takes us back in time to the Biblical flood in a manner that is both exciting and sobering, and helps us to reflect on our own humanity. "Mount Deceit" is a thriller that surprisingly comforts us as it winds to its conclusion. "Charlie the Pack Rat" and "EL CHAMUO" take us into the unusual and the downright scary while showing man's initial reaction to things he does not understand.
You may like...
New Insights Into Operations Research… (Courtney Hoover, Hardcover)
Petri Net Synthesis for Discrete Event… (MengChu Zhou, F. DiCesare, Hardcover, R4,145)