This volume describes and analyzes in a systematic way the great contributions of the philosopher Krister Segerberg to the study of real and doxastic actions. Following an introduction which functions as a roadmap to Segerberg's works on actions, the first part of the book covers relations between actions, intentions and routines, dynamic logic as a theory of action, agency, and deontic logics built upon the logics of actions. The second section explores belief revision and update, iterated and irrevocable belief change, dynamic doxastic logic and hypertheories. Segerberg has worked for more than thirty years to analyze the intricacies of real and doxastic actions using formal tools - mostly modal (dynamic) logic and its semantics. He has had such a significant impact on modal logic that "It is hard to roam for long in modal logic without finding Krister Segerberg's traces," as Johan van Benthem notes in his chapter of this book.
The purpose of the 11th International Conference on Software Engineering Research, Management and Applications (SERA 2013), held on August 7-9, 2013 in Prague, Czech Republic, was to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas and research results about all aspects (theory, applications and tools) of Software Engineering Research, Management and Applications, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected 17 outstanding papers from those accepted for presentation at the conference for publication in this volume. The papers were chosen based on review scores submitted by members of the program committee and further rigorous rounds of review.
R/3 is a business system that has gained global prominence. However, SAP R/3 has 237,000 function modules, and quite often programmers are unaware that a module exists which can be of help in their programs. This convenient resource is a collection of the most common ABAP modules, demonstrated within simple programs. These easily searchable example programs can be accessed from http://extras.springer.com/978-1-85233-775-9. The modules in this book are organised for quick reference. This concise reference contains: a full explanation of the layout of reference entries; a brief introduction to SAP; coverage of conversion and date and time modules; file and directory modules; list, long texts, and number modules; useful integration modules for MS Office and pop-up dialog box management. This book organises over 300 modules, many of which are undocumented in text, arranges them for quick and easy reference, and explains when and where to use the most common SAP R/3 ABAP function modules.
This book deals with the theory and applications of the Reformulation-Linearization/Convexification Technique (RLT) for solving nonconvex optimization problems. A unified treatment of discrete and continuous nonconvex programming problems is presented using this approach. In essence, the bridge between these two types of nonconvexities is made via a polynomial representation of discrete constraints. For example, the binariness of a 0-1 variable x_j can be equivalently expressed as the polynomial constraint x_j(1 - x_j) = 0. The motivation for this book is the role of tight linear/convex programming representations or relaxations in solving such discrete and continuous nonconvex programming problems. The principal thrust is to commence with a model that affords a useful representation and structure, and then to further strengthen this representation through automatic reformulation and constraint generation techniques. As mentioned above, the focal point of this book is the development and application of RLT for use as an automatic reformulation procedure, and also to generate strong valid inequalities. The RLT operates in two phases. In the Reformulation Phase, certain types of additional implied polynomial constraints, which include the aforementioned constraints in the case of binary variables, are appended to the problem. The resulting problem is subsequently linearized in the Linearization/Convexification Phase, except that certain convex constraints are sometimes retained in particular special cases. This is done via the definition of suitable new variables to replace each distinct variable-product term. The higher dimensional representation yields a linear (or convex) programming relaxation.
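As a minimal illustration of the two RLT phases described above (the notation is generic, not quoted from the book): take two binary variables x_1, x_2 and the product term x_1 x_2.

```latex
% Reformulation Phase: multiply the bound-factor constraints pairwise.
% For binary x_1, x_2 (so 0 <= x_j <= 1), the four pairwise products give
%   x_1 x_2 >= 0,        x_1 (1 - x_2) >= 0,
%   (1 - x_1) x_2 >= 0,  (1 - x_1)(1 - x_2) >= 0.
% Linearization Phase: substitute a new variable w_{12} for the product x_1 x_2:
\begin{align*}
  w_{12} &\ge 0, \\
  x_1 - w_{12} &\ge 0, \\
  x_2 - w_{12} &\ge 0, \\
  1 - x_1 - x_2 + w_{12} &\ge 0.
\end{align*}
```

These four linear inequalities describe the convex hull of the set w_{12} = x_1 x_2 over the unit square, which is how the higher-dimensional representation yields a linear programming relaxation.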
This book investigates the design of compilers for procedural languages, based on the algebraic laws which these languages satisfy. The particular strategy adopted is to reduce an arbitrary source program to a general normal form, capable of representing an arbitrary target machine. This is achieved by a series of normal form reduction theorems which are proved algebraically from the more basic laws. The normal form and the related reduction theorems can then be instantiated to design compilers for distinct target machines. This constitutes the main novelty of the author's approach to compilation, together with the fact that the entire process is formalised within a single and uniform semantic framework of a procedural language and its algebraic laws. Furthermore, by mechanising the approach using the OBJ3 term rewriting system it is shown that a prototype compiler is developed as a byproduct of its own proof of correctness.
Process Technology brings together in one place important contributions and up-to-date research results in this fast moving area. Process Technology serves as an excellent reference, providing insight into some of the most challenging research issues in the field.
This volume contains a collection of research and survey papers written by some of the most eminent mathematicians in the international community and is dedicated to Helmut Maier, whose own research has been groundbreaking and deeply influential to the field. Specific emphasis is given to topics regarding exponential and trigonometric sums and their behavior in short intervals, anatomy of integers and cyclotomic polynomials, small gaps in sequences of sifted prime numbers, oscillation theorems for primes in arithmetic progressions, inequalities related to the distribution of primes in short intervals, the Moebius function, Euler's totient function, the Riemann zeta function and the Riemann Hypothesis. Graduate students, research mathematicians, as well as computer scientists and engineers who are interested in pure and interdisciplinary research, will find this volume a useful resource. Contributors to this volume: Bill Allombert, Levent Alpoge, Nadine Amersi, Yuri Bilu, Regis de la Breteche, Christian Elsholtz, John B. Friedlander, Kevin Ford, Daniel A. Goldston, Steven M. Gonek, Andrew Granville, Adam J. Harper, Glyn Harman, D. R. Heath-Brown, Aleksandar Ivic, Geoffrey Iyer, Jerzy Kaczorowski, Daniel M. Kane, Sergei Konyagin, Dimitris Koukoulopoulos, Michel L. Lapidus, Oleg Lazarev, Andrew H. Ledoan, Robert J. Lemke Oliver, Florian Luca, James Maynard, Steven J. Miller, Hugh L. Montgomery, Melvyn B. Nathanson, Ashkan Nikeghbali, Alberto Perelli, Amalia Pizarro-Madariaga, Janos Pintz, Paul Pollack, Carl Pomerance, Michael Th. Rassias, Maksym Radziwill, Joel Rivat, Andras Sarkoezy, Jeffrey Shallit, Terence Tao, Gerald Tenenbaum, Laszlo Toth, Tamar Ziegler, Liyang Zhang.
100 Go Mistakes: How to Avoid Them introduces dozens of techniques for writing idiomatic, expressive, and efficient Go code that avoids common pitfalls. By reviewing dozens of interesting, readable examples and real-world case studies, you'll explore mistakes that even experienced Go programmers make. This book is focused on pure Go code, with standards you can apply to any kind of project. As you go, you'll navigate the tricky bits of handling JSON data and HTTP services, discover best practices for Go code organization, and learn how to use slices efficiently. Your code speed and quality will enjoy a huge boost when you improve your concurrency skills, deal with error management idiomatically, and increase the quality of your tests. About the Technology Go is simple to learn, yet hard to master. Even experienced Go developers may end up introducing bugs and inefficiencies into their code. This book accelerates your understanding of Go's quirks, helping you correct mistakes and dodge pitfalls on your path to Go mastery.
'Visual Languages for Interactive Computing' presents problems and methodologies related to the syntax, semantics, and ambiguities of visual languages.
Object-Z is an object-oriented extension of the formal specification language Z. It adds to Z notions of classes and objects, and inheritance and polymorphism. By extending Z's semantic basis, it enables the specification of systems as collections of independent objects in which self and mutual referencing are possible. The Object-Z Specification Language presents a comprehensive description of Object-Z including discussions of semantic issues, definitions of all language constructs, type rules and other rules of usage, specification guidelines, and a full concrete syntax. It will enable you to confidently construct Object-Z specifications and is intended as a reference manual to keep by your side as you use and learn to use Object-Z. The Object-Z Specification Language is suitable as a textbook or as a secondary text for a graduate-level course, and as a reference for researchers and practitioners in industry.
This book deals with decision making in environments of significant data uncertainty, with particular emphasis on operations and production management applications. For such environments, we suggest the use of the robustness approach to decision making, which assumes inadequate knowledge of the decision maker about the random state of nature and develops a decision that hedges against the worst contingency that may arise. The main motivating factors for a decision maker to use the robustness approach are: * It does not ignore uncertainty and takes a proactive step in response to the fact that forecasted values of uncertain parameters will not occur in most environments; * It applies to decisions of unique, non-repetitive nature, which are common in many fast and dynamically changing environments; * It accounts for the risk averse nature of decision makers; and * It recognizes that even though decision environments are fraught with data uncertainties, decisions are evaluated ex post with the realized data. For all of the above reasons, robust decisions are dear to the heart of operational decision makers. This book takes a giant first step in presenting decision support tools and solution methods for generating robust decisions in a variety of interesting application environments. Robust Discrete Optimization is a comprehensive mathematical programming framework for robust decision making.
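The worst-case hedging described above is commonly formalized as a minimax criterion; the notation below is generic (X a decision set, S a scenario set, f a cost function) and is not quoted from the book:

```latex
% Absolute robustness: minimize the worst-case cost over all scenarios.
\[
  x^{*} = \operatorname*{arg\,min}_{x \in X} \, \max_{s \in S} f(x, s)
\]
% Robust deviation (regret): hedge against the ex post gap to the
% decision that would have been optimal for the realized scenario.
\[
  x^{*} = \operatorname*{arg\,min}_{x \in X} \, \max_{s \in S}
  \Bigl( f(x, s) - \min_{y \in X} f(y, s) \Bigr)
\]
```

The regret form matches the book's observation that decisions are evaluated ex post with the realized data: it compares each candidate against the best decision one could have made with hindsight.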
3D Character Development Workshop is designed to fast-track comprehension of the concepts, tools, and methods of character rigging so that you can get past the technical hurdles and on to animating! This comprehensive guide is simple enough for non-technical artists to follow, yet presented in a holistic, comprehensive, best-practices approach so professional and student animators and artists can begin designing and animating their own fully-functioning characters.
Speech Dereverberation gathers together an overview, a mathematical formulation of the problem and the state-of-the-art solutions for dereverberation. Speech Dereverberation presents current approaches to the problem of reverberation. It provides a review of topics in room acoustics and also describes performance measures for dereverberation. The algorithms are then explained with mathematical analysis and examples that enable the reader to see the strengths and weaknesses of the various techniques, as well as giving an understanding of the questions still to be addressed. Techniques rooted in speech enhancement are included, in addition to a treatment of multichannel blind acoustic system identification and inversion. The TRINICON framework is shown in the context of dereverberation to be a generalization of the signal processing for a range of analysis and enhancement techniques. Speech Dereverberation is suitable for students at masters and doctoral level, as well as established researchers.
This indispensable text introduces the foundations of three-dimensional computer vision and describes recent contributions to the field. Fully revised and updated, this much-anticipated new edition reviews a range of triangulation-based methods, including linear and bundle adjustment based approaches to scene reconstruction and camera calibration, stereo vision, point cloud segmentation, and pose estimation of rigid, articulated, and flexible objects. Also covered are intensity-based techniques that evaluate the pixel grey values in the image to infer three-dimensional scene structure, and point spread function based approaches that exploit the effect of the optical system. The text shows how methods which integrate these concepts are able to increase reconstruction accuracy and robustness, describing applications in industrial quality inspection and metrology, human-robot interaction, and remote sensing.
A function is convex if its epigraph is convex. This geometrical structure has very strong implications in terms of continuity and differentiability. Separation theorems lead to optimality conditions and duality for convex problems. A function is quasiconvex if its lower level sets are convex. Here again, the geometrical structure of the level sets implies some continuity and differentiability properties for quasiconvex functions. Optimality conditions and duality can be derived for optimization problems involving such functions as well. Over a period of about fifty years, quasiconvex and other generalized convex functions have been considered in a variety of fields including economics, management science, engineering, probability and applied sciences in accordance with the need of particular applications. During the last twenty-five years, an increase of research activities in this field has been witnessed. More recently, generalized monotonicity of maps has been studied. It relates to generalized convexity of functions as monotonicity relates to convexity. Generalized monotonicity plays a role in variational inequality problems, complementarity problems and, more generally, in equilibrium problems.
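For reference, the two geometric definitions in the paragraph above can be written out as follows (standard notation, not quoted from the book):

```latex
% f is convex iff its epigraph is a convex set:
\[
  \operatorname{epi} f = \{ (x, t) : f(x) \le t \}
  \text{ is convex} \iff f \text{ is convex.}
\]
% f is quasiconvex iff every lower level set is convex:
\[
  L_{\alpha} = \{ x : f(x) \le \alpha \}
  \text{ is convex for all } \alpha \in \mathbb{R}
  \iff f \text{ is quasiconvex.}
\]
```

Every convex function is quasiconvex (a convex epigraph forces convex level sets), but not conversely: any monotone function of one variable is quasiconvex without needing to be convex.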
Nonsmooth energy functions govern phenomena which occur frequently in nature and in all areas of life. They constitute a fascinating subject in mathematics and permit the rational understanding of yet unsolved or partially solved questions in mechanics, engineering and economics. This is the first book to provide a complete and rigorous presentation of the quasidifferentiability approach to nonconvex, possibly nonsmooth, energy functions, of the derivation and study of the corresponding variational expressions in mechanics, engineering and economics, and of their numerical treatment. The new variational formulations derived are illustrated by many interesting numerical problems. The techniques presented will permit the reader to check any solution obtained by other heuristic techniques for nonconvex, nonsmooth energy problems. A civil, mechanical or aeronautical engineer can find in the book the only existing mathematically sound technique for the formulation and study of nonconvex, nonsmooth energy problems. Audience: The book will be of interest to pure and applied mathematicians, physicists, researchers in mechanics, civil, mechanical and aeronautical engineers, structural analysts and software developers. It is also suitable for graduate courses in nonlinear mechanics, nonsmooth analysis, applied optimization, control, calculus of variations and computational mechanics.
The emergence of Web 2.0 is provoking challenging questions for developers: What products and services can our company provide to customers and employees using Rich Internet Applications, mash-ups, Web feeds or Ajax? Which business models are appropriate and how do we implement them? What are best practices and how do we apply them?
Since its establishment in 1998, Microsoft Research Asia's trademark and long term commitment has been to foster innovative research and advanced education in the Asia-Pacific region. Through open collaboration and partnership with universities, government and other academic partners, MSRA has been consistently advancing the state-of-the-art in computer science. This book was compiled to record these outstanding collaborations, as Microsoft Research Asia celebrates its 10th Anniversary. The selected papers are all authored or co-authored by faculty members or students through collaboration with MSRA lab researchers, or with the financial support of MSRA. Papers previously published in top-tier international conference proceedings and journals are compiled here into one accessible volume of outstanding research. Innovation Together highlights the outstanding work of Microsoft Research Asia as it celebrates ten years of achievement and looks forward to the next decade of success.
Industrial development of software systems needs to be guided by recognized engineering principles. Commercial-off-the-shelf (COTS) components enable the systematic and cost-effective reuse of prefabricated tested parts, a characteristic approach of mature engineering disciplines. This reuse necessitates a thorough test of these components to make sure that each works as specified in a real context. Beydeda and Gruhn invited leading researchers in the area of component testing to contribute to this monograph, which covers all related aspects from testing components in a context-independent manner through testing components in the context of a specific system to testing complete systems built from different components. The authors take the viewpoints of both component developers and component users, and their contributions encompass functional requirements such as correctness and functionality compliance as well as non-functional requirements like performance and robustness. Overall this monograph offers researchers, graduate students and advanced professionals a unique and comprehensive overview of the state of the art in testing COTS components and COTS-based systems.
The Verilog hardware description language provides the ability to describe digital and analog systems for design concepts and implementation. It was developed originally at Gateway Design and implemented there. Now it is an open standard of IEEE and Open Verilog International and is supported by many tools and processes. The Complete Verilog Book introduces the language and describes it in a comprehensive manner. In The Complete Verilog Book, each feature of the language is described using semantic introduction, syntax and examples. A chapter on semantics explains the basic concepts and algorithms that form the basis of every evaluation and every sequence of evaluations that ultimately provides the meaning or full semantics of the language. The Complete Verilog Book takes the approach that Verilog is not only a simulation language or a synthesis language or a formal method of describing design, but is a totality of all these and covers many aspects not covered before but which are essential parts of any design process using Verilog. The Complete Verilog Book starts with a tutorial introduction. It explains the data types in Verilog HDL, as the object-oriented world knows that the language-constructs and data types are equally important parts of a language. The Complete Verilog Book explains the three views, behavioral, RTL and structural and then describes features in each of these views. The Complete Verilog Book keeps the reader abreast of current developments in the Verilog world such as Verilog-A, cycle simulation, SD, and DCL, and uses IEEE 1364 syntax. The Complete Verilog Book will be useful to all those who want to learn Verilog HDL and to explore its various facets.
The evolutionary approach called scatter search originated from strategies for creating composite decision rules and surrogate constraints. Recent studies demonstrate the practical advantages of this approach for solving a diverse array of optimization problems from both classical and real world settings. Scatter search contrasts with other evolutionary procedures, such as genetic algorithms, by providing unifying principles for joining solutions based on generalized path constructions in Euclidean space and by utilizing strategic designs where other approaches resort to randomization. The book's goal is to provide the basic principles and fundamental ideas that will allow the readers to create successful applications of scatter search. The book includes the C source code of the methods introduced in each chapter.
Evolutionary Algorithms for Embedded System Design describes how Evolutionary Algorithm (EA) concepts can be applied to circuit and system design - an area where time-to-market demands are critical. EAs create an interesting alternative to other approaches since they can be scaled with the problem size and can be easily run on parallel computer systems. This book presents several successful EA techniques and shows how they can be applied at different levels of the design process. Starting on a high-level abstraction, where software components are dominant, several optimization steps are demonstrated, including DSP code optimization and test generation. Throughout the book, EAs are tested on real-world applications and on large problem instances. For each application the main criteria for the successful application in the corresponding domain are discussed. In addition, contributions from leading international researchers provide the reader with a variety of perspectives, including a special focus on the combination of EAs with problem specific heuristics. Evolutionary Algorithms for Embedded System Design is an excellent reference for both practitioners working in the area of circuit and system design and for researchers in the field of evolutionary concepts.