This monograph focuses on recent advances in smart, multimedia and computer gaming technologies. The contributions include: *Smart Gamification and Smart Serious Games. *Fusion of secure IPsec-based Virtual Private Network, mobile computing and rich multimedia technology. *Teaching and Promoting Smart Internet of Things Solutions Using the Serious-game Approach. *Evaluation of Student Knowledge using an e-Learning Framework. *The iTEC Eduteka. *3D Virtual Worlds as a Fusion of Immersing, Visualizing, Recording, and Replaying Technologies. *Fusion of multimedia and mobile technology in audio guides for Museums and Exhibitions: from Bluetooth Push to Web Pull. The book is directed at researchers, students and software developers working in the areas of education and information technologies.
In this paper we propose a model of tax incentive optimization for investment projects using the mechanism of accelerated depreciation. Unlike tax holidays, which affect the effective income tax rate, accelerated depreciation affects taxable income. In modern economic practice, states actively use mechanisms such as accelerated depreciation and tax holidays to attract investment into the creation of new enterprises. The problem under consideration is the following. Assume that the state (region) is interested in the realization of a certain investment project, for example, the creation of a new enterprise. In order to attract a potential investor, the state decides to use a mechanism of accelerated tax depreciation. The following question arises: what is a reasonable principle for choosing the depreciation rate? From the state's point of view, the future investor's behavior will be rational. This means that, observing the economic environment, the investor chooses the moment of investment that maximizes his expected net present value (NPV) from the given project. In this case both the criterion and the "investment rule" depend on the depreciation policy proposed by the state. For simplicity we will suppose that the state's objective for a given project is the maximization of discounted tax payments into the budget from the enterprise after its creation. Of course, these payments depend on the moment of the investor's entry and, therefore, on the depreciation policy established by the state.
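The trade-off described here can be made concrete with a small numerical sketch. Everything below is hypothetical: the cost, profit, tax and discount figures are invented, and declining-balance depreciation stands in for whatever accelerated schedule the state might offer; this illustrates the mechanism, not the paper's model.

```python
# Hypothetical illustration: how the depreciation rate shifts value between
# investor NPV and discounted tax revenue (all figures are made up).

def cash_flows(dep_rate, cost=100.0, profit=20.0, tax_rate=0.3,
               discount=0.08, horizon=15):
    """Declining-balance depreciation at `dep_rate` per year."""
    book_value = cost
    npv_investor = -cost
    npv_taxes = 0.0
    for t in range(1, horizon + 1):
        depreciation = dep_rate * book_value
        book_value -= depreciation
        taxable = max(0.0, profit - depreciation)  # depreciation shields profit
        tax = tax_rate * taxable
        disc = (1 + discount) ** t
        npv_investor += (profit - tax) / disc
        npv_taxes += tax / disc
    return npv_investor, npv_taxes

for rate in (0.10, 0.25, 0.50):  # slower vs. more accelerated depreciation
    inv, tax = cash_flows(rate)
    print(f"dep rate {rate:.2f}: investor NPV = {inv:6.2f}, "
          f"discounted taxes = {tax:6.2f}")
```

Raising the depreciation rate moves deductions earlier, which raises the investor's NPV and lowers the state's discounted tax take; the paper's question is how to balance exactly this trade-off.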
Along with the traditional material concerning linear programming (the simplex method, the theory of duality, the dual simplex method), In-Depth Analysis of Linear Programming contains new results of research carried out by the authors. For the first time, the criteria of stability (in the geometrical and algebraic forms) of the general linear programming problem are formulated and proved. New regularization methods based on the idea of extension of an admissible set are proposed for solving unstable (ill-posed) linear programming problems. In contrast to the well-known regularization methods, in the methods proposed in this book the initial unstable problem is replaced by a new stable auxiliary problem. This is also a linear programming problem, which can be solved by standard finite methods. In addition, the authors indicate the conditions imposed on the parameters of the auxiliary problem which guarantee its stability, and this circumstance advantageously distinguishes the regularization methods proposed in this book from the existing methods. In these existing methods, the stability of the auxiliary problem is usually only presupposed but is not explicitly investigated. In this book, the traditional material contained in the first three chapters is expounded in much simpler terms than in the majority of books on linear programming, which makes it accessible to beginners as well as those more familiar with the area.
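As a taste of the problem class treated in those first chapters, here is a toy linear program solved with an off-the-shelf finite method; SciPy's linprog is used as a stand-in solver (a convenience assumption, not software from the book).

```python
# A toy linear program: maximize 3x + 2y subject to
# x + y <= 4, x + 3y <= 6, x, y >= 0. SciPy's linprog minimizes,
# so the objective is negated. Illustrative only.
from scipy.optimize import linprog

res = linprog(c=[-3, -2],                 # minimize -(3x + 2y)
              A_ub=[[1, 1], [1, 3]],
              b_ub=[4, 6],
              bounds=[(0, None), (0, None)],
              method="highs")
print(res.x, -res.fun)  # optimal point and objective value
```

The optimum here sits at the vertex x = 4, y = 0 with objective value 12.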
This book provides graduate students and practitioners with knowledge of the CORBA standard and practical experience of implementing distributed systems with CORBA's Java mapping, complete with tested code examples that run immediately.
Mathematical Programming and Financial Objectives for Scheduling Projects focuses on decision problems where the performance is measured in terms of money. As the title suggests, special attention is paid to financial objectives and the relationship of financial objectives to project schedules and scheduling. In addition, how schedules relate to other decisions is treated in detail. The book demonstrates that scheduling must be combined with project selection and financing, and that scheduling helps to give an answer to the planning issue of the amount of resources required for a project. The author makes clear the relevance of scheduling to cutting budget costs. The book is divided into six parts. The first part gives a brief introduction to project management. Part two examines scheduling projects in order to maximize their net present value. Part three considers capital rationing. Many decisions on selecting or rejecting a project cannot be made in isolation and multiple projects must be taken fully into account. Since the requests for capital resources depend on the schedules of the projects, scheduling takes on more complexity. Part four studies the resource usage of a project in greater detail. Part five discusses cases where the processing time of an activity is a decision to be made. Part six summarizes the main results that have been accomplished.
The BeOS is the exciting new operating system designed natively for the Internet and digital media. Programmers are drawn to the BeOS by its many state-of-the-art features, including pervasive multithreading, a symmetric multiprocessing architecture, and an integrated multithreaded graphics system. The Be engineering team also built in many UNIX-like capabilities as part of a POSIX toolkit. Best of all, the BeOS runs on a variety of Intel architectures and PowerPC platforms and uses off-the-shelf hardware. This book explores the BeOS from a POSIX programmer's point of view, providing a comprehensive and practical guide to porting UNIX and other POSIX-based software to the BeOS. BeOS: Porting UNIX Applications will help you move your favorite UNIX software to an environment designed from the ground up for high-performance applications.
In the past several years, there have been significant technological advances in the field of crisis response. However, many aspects concerning the efficient collection and integration of geo-information, applied semantics and situation awareness for disaster management remain open. Improving crisis response systems and making them intelligent requires extensive collaboration between emergency responders, disaster managers, system designers and researchers alike. To facilitate this process, the Gi4DM (GeoInformation for Disaster Management) conferences have been held regularly since 2005. The events are coordinated by the Joint Board of Geospatial Information Societies (JB GIS) and ICSU GeoUnions. This book presents the outcomes of the Gi4DM 2018 conference, which was organised by the ISPRS-URSI Joint Working Group ICWG III/IVa: Disaster Assessment, Monitoring and Management and held in Istanbul, Turkey on 18-21 March 2018. It includes 12 scientific papers focusing on the intelligent use of geo-information, semantics and situation awareness.
Researchers working with nonlinear programming often claim "the world is nonlinear", indicating that real applications require nonlinear modeling. The same is true for other areas such as multi-objective programming (there are always several goals in a real application), stochastic programming (all data is uncertain and therefore stochastic models should be used), and so forth. In this spirit we claim: the world is multilevel. In many decision processes there is a hierarchy of decision makers, and decisions are made at different levels in this hierarchy. One way to handle such hierarchies is to focus on one level and include other levels' behaviors as assumptions. Multilevel programming is the research area that focuses on the whole hierarchy structure. In terms of modeling, the constraint domain associated with a multilevel programming problem is implicitly determined by a series of optimization problems which must be solved in a predetermined sequence. If only two levels are considered, we have one leader (associated with the upper level) and one follower (associated with the lower level).
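The two-level case can be sketched in a few lines of code. The objectives below are invented toy functions, and brute-force grid enumeration is used purely for illustration; real bilevel problems need the specialized methods this book studies.

```python
# Minimal leader-follower (bilevel) sketch with made-up objectives.
# The leader picks x; the follower then best-responds with y(x); the
# leader's payoff is evaluated at that response. Solved by brute-force
# enumeration over a grid, which is only viable for tiny toy problems.
import numpy as np

xs = np.linspace(0, 2, 201)
ys = np.linspace(0, 2, 201)

def follower_best_response(x):
    # the follower minimizes its own cost (y - x)**2 + y over y
    costs = (ys - x) ** 2 + ys
    return ys[np.argmin(costs)]

# the leader maximizes -(x - 1)**2 + y(x), anticipating the follower
best = max(xs, key=lambda x: -(x - 1) ** 2 + follower_best_response(x))
y_star = follower_best_response(best)
print(f"leader x = {best:.2f}, follower y = {y_star:.2f}")
```

On this toy instance the follower's response is y = x - 0.5, so the leader settles at x = 1.5 with y = 1.0, a solution neither level would pick in isolation.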
In August 1997 a conference titled "From Local to Global Optimization" was held at Storgarden in Rimforsa near the Linkoping Institute of Technology, Sweden. The conference gave us the opportunity to celebrate Hoang Tuy's achievements in Optimization during his 70 years of life. This book consists of a collection of research papers based on results presented during the conference and is dedicated to Professor Hoang Tuy on the occasion of his 70th birthday. The papers cover a wide range of recent results in Mathematical Programming. The work of Hoang Tuy, in particular in Global Optimization, has provided directions for new algorithmic developments in the field. We are indebted to Kluwer Academic Publishers for inviting us to publish this volume, and to the Center for Industrial Information Transfer (CENIIT) for financial support. We wish to thank the referees for their help and the authors for their papers. We also wish to join all contributors of this book in expressing birthday wishes and gratitude to Hoang Tuy for his inspiration, support, and friendship to all of us. Athanasios Migdalas, Panos M. Pardalos, and Peter Varbrand, November 1998. Hoang Tuy: An Appreciation. It's a pleasure for me as colleague and friend to take this opportunity to celebrate Hoang Tuy's numerous contributions to the field of mathematical programming.
With the purpose of building upon standard web technologies, open linked data serves as a useful way to connect previously unrelated data and to publish structured data on the web. The application of these elements leads to the creation of a data commons called the semantic web. Cases on Open-Linked Data and Semantic Web Applications brings together new theories, research findings and case studies which cover the recent developments and approaches towards applied open linked data and semantic web in the context of information systems. By enhancing the understanding of open linked data in business, science and information technologies, this reference source aims to be useful for academics, researchers, and practitioners.
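At its core, publishing linked data means asserting subject-predicate-object triples against shared vocabularies. A minimal sketch using Python's rdflib library (the library choice and the example URI are assumptions, not tools from the book):

```python
# Minimal linked-data sketch with rdflib (the URI is invented): state a
# single fact as a subject-predicate-object triple and serialize it in
# Turtle, the kind of structured publishing the book's cases build on.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import FOAF

g = Graph()
alice = URIRef("http://example.org/people/alice")
g.add((alice, FOAF.name, Literal("Alice")))  # <alice> foaf:name "Alice"
print(g.serialize(format="turtle"))
```

Because the predicate comes from the shared FOAF vocabulary, any other dataset using FOAF can link to this triple without prior coordination, which is the "connecting previously unrelated data" the blurb describes.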
Recent developments in computer science clearly show the need for a better theoretical foundation for some central issues. Methods and results from mathematical logic, in particular proof theory and model theory, are of great help here and will be used much more in future than previously. This book provides an excellent introduction to the interplay of mathematical logic and computer science. It contains extensively reworked versions of the lectures given at the 1997 Marktoberdorf Summer School by leading researchers in the field.
In this book, the author considers separable programming and, in particular, one of its important cases - convex separable programming. Some general results are presented, and techniques for approximating the separable problem by linear programming and dynamic programming are considered. Convex separable programs subject to inequality/equality constraint(s) and bounds on variables are also studied, and iterative algorithms of polynomial complexity are proposed. As an application, these algorithms are used in the implementation of stochastic quasigradient methods for some separable stochastic programs. Numerical approximation with respect to the l1 and l∞ norms, as a convex separable nonsmooth unconstrained minimization problem, is considered as well. Audience: Advanced undergraduate and graduate students, mathematical programming/operations research specialists.
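The linear-programming approximation mentioned above rests on replacing each convex one-variable term with a piecewise-linear interpolant over breakpoints. A minimal sketch with an invented function f(x) = x^2 (not an example from the book):

```python
# Sketch: piecewise-linear approximation of a convex separable term,
# the standard device for reducing a separable program to a linear one.
# With f convex, the chord between breakpoints over-estimates f, and
# refining the breakpoint grid drives the approximation error down.
import numpy as np

def pl_max_error(f, a, b, n_breaks):
    xs = np.linspace(a, b, n_breaks)          # breakpoints
    grid = np.linspace(a, b, 1001)            # dense evaluation grid
    approx = np.interp(grid, xs, f(xs))       # piecewise-linear interpolant
    return np.max(np.abs(approx - f(grid)))

f = lambda x: x ** 2
for n in (3, 5, 9, 17):
    print(f"{n:2d} breakpoints: max error = {pl_max_error(f, 0, 4, n):.4f}")
```

For this f the worst error on a segment of width h is h^2/4, so halving the spacing quarters the error, which is why a modest number of breakpoints already gives a usable linear model.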
This book deals with the theory and applications of the Reformulation-Linearization/Convexification Technique (RLT) for solving nonconvex optimization problems. A unified treatment of discrete and continuous nonconvex programming problems is presented using this approach. In essence, the bridge between these two types of nonconvexities is made via a polynomial representation of discrete constraints. For example, the binariness of a 0-1 variable x_j can be equivalently expressed as the polynomial constraint x_j(1 - x_j) = 0. The motivation for this book is the role of tight linear/convex programming representations or relaxations in solving such discrete and continuous nonconvex programming problems. The principal thrust is to commence with a model that affords a useful representation and structure, and then to further strengthen this representation through automatic reformulation and constraint generation techniques. As mentioned above, the focal point of this book is the development and application of RLT for use as an automatic reformulation procedure, and also, to generate strong valid inequalities. The RLT operates in two phases. In the Reformulation Phase, certain types of additional implied polynomial constraints, that include the aforementioned constraints in the case of binary variables, are appended to the problem. The resulting problem is subsequently linearized, except that certain convex constraints are sometimes retained in particular special cases, in the Linearization/Convexification Phase. This is done via the definition of suitable new variables to replace each distinct variable-product term. The higher dimensional representation yields a linear (or convex) programming relaxation.
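The variable-substitution step is easy to see in miniature: for binary x_i and x_j, the product x_i x_j is replaced by a new variable w constrained by w >= 0, w >= x_i + x_j - 1, w <= x_i and w <= x_j (the level-1 bound-factor products). A small sketch verifying that these linear constraints pin w to the product at binary points:

```python
# Sketch of the linearization step: each product x_i * x_j of binary
# variables is replaced by a new variable w with the linear bounds
# w >= 0, w >= x_i + x_j - 1, w <= x_i, w <= x_j. At binary points
# these bounds coincide and force w = x_i * x_j exactly.
from itertools import product

def w_bounds(xi, xj):
    lower = max(0.0, xi + xj - 1.0)
    upper = min(xi, xj)
    return lower, upper

for xi, xj in product((0, 1), repeat=2):
    lo, hi = w_bounds(xi, xj)
    assert lo == hi == xi * xj      # the relaxation is exact at binary points
    print(f"x_i={xi}, x_j={xj}: w forced to {xi * xj}")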
A function is convex if its epigraph is convex. This geometrical structure has very strong implications in terms of continuity and differentiability. Separation theorems lead to optimality conditions and duality for convex problems. A function is quasiconvex if its lower level sets are convex. Here again, the geometrical structure of the level sets implies some continuity and differentiability properties for quasiconvex functions. Optimality conditions and duality can be derived for optimization problems involving such functions as well. Over a period of about fifty years, quasiconvex and other generalized convex functions have been considered in a variety of fields including economics, management science, engineering, probability and applied sciences in accordance with the needs of particular applications. During the last twenty-five years, an increase of research activity in this field has been witnessed. More recently, generalized monotonicity of maps has been studied. It relates to generalized convexity of functions as monotonicity relates to convexity. Generalized monotonicity plays a role in variational inequality problems, complementarity problems and, more generally, in equilibrium problems.
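The two defining geometrical objects can be written out explicitly; these are the standard definitions behind the discussion above.

```latex
% Standard definitions: f is convex iff its epigraph is a convex set;
% f is quasiconvex iff every lower level set L_alpha(f) is convex.
\[
  \operatorname{epi} f = \{\, (x,t) : f(x) \le t \,\}, \qquad
  L_{\alpha}(f) = \{\, x : f(x) \le \alpha \,\}.
\]
```

Every convex function is quasiconvex, since a convex epigraph forces convex level sets, but not conversely: x maps to the square root of |x| has interval level sets yet a nonconvex epigraph.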
Nonsmooth energy functions govern phenomena which occur frequently in nature and in all areas of life. They constitute a fascinating subject in mathematics and permit the rational understanding of yet unsolved or partially solved questions in mechanics, engineering and economics. This is the first book to provide a complete and rigorous presentation of the quasidifferentiability approach to nonconvex, possibly nonsmooth, energy functions, of the derivation and study of the corresponding variational expressions in mechanics, engineering and economics, and of their numerical treatment. The new variational formulations derived are illustrated by many interesting numerical problems. The techniques presented will permit the reader to check any solution obtained by other heuristic techniques for nonconvex, nonsmooth energy problems. A civil, mechanical or aeronautical engineer can find in the book the only existing mathematically sound technique for the formulation and study of nonconvex, nonsmooth energy problems. Audience: The book will be of interest to pure and applied mathematicians, physicists, researchers in mechanics, civil, mechanical and aeronautical engineers, structural analysts and software developers. It is also suitable for graduate courses in nonlinear mechanics, nonsmooth analysis, applied optimization, control, calculus of variations and computational mechanics.
R/3 is a business system that has gained global prominence. However, SAP R/3 has 237,000 function modules, and quite often programmers are unaware that a module exists which can be of help in their programs. This convenient resource is a collection of the most common ABAP modules, demonstrated within simple programs. These easily searchable example programs can be accessed from http://extras.springer.com/978-1-85233-775-9. The modules in this book are organised for quick reference. This concise reference contains: a full explanation of the layout of reference entries; a brief introduction to SAP; coverage of conversion and date and time modules; file and directory modules; list, long text, and number modules; useful integration modules for MS Office and pop-up dialog box management. This book organises over 300 modules, many of which are undocumented, arranges them for quick and easy reference, and explains when and where to use the most common SAP R/3 ABAP function modules.
In this book, three main notions are used in the editors' search for improvements in various areas of computer graphics: Artificial Intelligence, Viewpoint Complexity and Human Intelligence. Several Artificial Intelligence techniques are used in the intelligent scene modelers presented, mainly declarative ones. Among them, the most frequently used techniques are expert systems, Constraint Satisfaction Problem resolution and machine learning. The notion of viewpoint complexity, that is, the complexity of a scene seen from a given viewpoint, is used in improvement proposals for many computer graphics problems such as scene understanding, virtual world exploration, image-based modeling and rendering, ray tracing and radiosity. Very often, viewpoint complexity is used in conjunction with Artificial Intelligence techniques like heuristic search and problem resolution. The notions of Artificial Intelligence and Viewpoint Complexity may help to automatically resolve a large number of computer graphics problems. However, there are special situations where a particular solution must be found for each situation. In such cases, human intelligence has to replace, or be combined with, artificial intelligence. Such cases, and the proposed solutions, are also presented in this book.
In "Distributed Algorithms", Nancy Lynch provides a blueprint for designing, implementing, and analyzing distributed algorithms. She directs her book at a wide audience, including students, programmers, system designers, and researchers. "Distributed Algorithms" contains the most significant algorithms and impossibility results in the area, all in a simple automata-theoretic setting. The algorithms are proved correct, and their complexity is analyzed according to precisely defined complexity measures. The problems covered include resource allocation, communication, consensus among distributed processes, data consistency, deadlock detection, leader election, global snapshots, and many others. The material is organized according to the system model: first by the timing model and then by the interprocess communication mechanism. The material on system models is isolated in separate chapters for easy reference. The presentation is completely rigorous, yet is intuitive enough for immediate comprehension. This book familiarizes readers with important problems, algorithms, and impossibility results in the area: readers can then recognize the problems when they arise in practice, apply the algorithms to solve them, and use the impossibility results to determine whether problems are unsolvable. The book also provides readers with the basic mathematical tools for designing new algorithms and proving new impossibility results. In addition, it teaches readers how to reason carefully about distributed algorithms: to model them formally, devise precise specifications for their required behavior, prove their correctness, and evaluate their performance with realistic measures.
Today, computers fulfil a dazzling array of roles, a flexibility resulting from the great range of programs that can be run on them. "A Science of Operations" examines the history of what we now call programming, defined not simply as "computer" programming, but more broadly as the definition of the steps involved in computations and other information-processing activities. This unique perspective highlights how the history of programming is distinct from the history of the computer, despite the close relationship between the two in the 20th century. The book also discusses how the development of programming languages is related to disparate fields which attempted to give a mechanical account of language on the one hand, and a linguistic account of machines on the other. Topics and features: *Covers the early development of automatic computing, including Babbage's "mechanical calculating engines" and the applications of punched-card technology. *Examines the theoretical work of mathematical logicians such as Kleene, Church, Post and Turing, and the machines built by Zuse and Aiken in the 1930s and 1940s. *Discusses the role that logic played in the development of the stored program computer. *Describes the "standard model" of machine-code programming popularised by Maurice Wilkes. *Presents the complete table for the universal Turing machine in the Appendices. *Investigates the rise of initiatives aimed at developing higher-level programming notations, and how these came to be thought of as 'languages' that could be studied independently of a machine. *Examines the importance of the Algol 60 language, and the framework it provided for studying the design of programming languages and the process of software development. *Explores the early development of object-oriented languages, with a focus on the Smalltalk project. This fascinating text offers a new viewpoint for historians of science and technology, as well as for the general reader. The historical narrative builds the story in a clear and logical fashion, roughly following chronological order.
Industrial development of software systems needs to be guided by recognized engineering principles. Commercial-off-the-shelf (COTS) components enable the systematic and cost-effective reuse of prefabricated tested parts, a characteristic approach of mature engineering disciplines. This reuse necessitates a thorough test of these components to make sure that each works as specified in a real context. Beydeda and Gruhn invited leading researchers in the area of component testing to contribute to this monograph, which covers all related aspects from testing components in a context-independent manner through testing components in the context of a specific system to testing complete systems built from different components. The authors take the viewpoints of both component developers and component users, and their contributions encompass functional requirements such as correctness and functionality compliance as well as non-functional requirements like performance and robustness. Overall this monograph offers researchers, graduate students and advanced professionals a unique and comprehensive overview of the state of the art in testing COTS components and COTS-based systems.
This book deals with decision making in environments of significant data uncertainty, with particular emphasis on operations and production management applications. For such environments, we suggest the use of the robustness approach to decision making, which assumes inadequate knowledge of the decision maker about the random state of nature and develops a decision that hedges against the worst contingency that may arise. The main motivating factors for a decision maker to use the robustness approach are: * It does not ignore uncertainty and takes a proactive step in response to the fact that forecasted values of uncertain parameters will not occur in most environments; * It applies to decisions of unique, non-repetitive nature, which are common in many fast and dynamically changing environments; * It accounts for the risk averse nature of decision makers; and * It recognizes that even though decision environments are fraught with data uncertainties, decisions are evaluated ex post with the realized data. For all of the above reasons, robust decisions are dear to the heart of operational decision makers. This book takes a giant first step in presenting decision support tools and solution methods for generating robust decisions in a variety of interesting application environments. Robust Discrete Optimization is a comprehensive mathematical programming framework for robust decision making.
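The worst-case hedging criterion is simple to state in code. A minimal sketch with invented scenario costs: each decision is scored by its worst outcome over the scenarios, and the minimax decision is the one whose worst case is least bad.

```python
# Sketch of the robustness criterion: pick the decision whose worst-case
# cost over all scenarios is smallest (minimax). Data are hypothetical.
costs = {                      # cost of each decision under each scenario
    "plan_A": [100, 180, 120],
    "plan_B": [130, 140, 135],
    "plan_C": [ 90, 210, 110],
}

robust = min(costs, key=lambda d: max(costs[d]))
print(robust, max(costs[robust]))   # plan_B: worst case 140
```

Note that the robust plan is chosen purely for its worst-case guarantee, not because it is cheapest under any forecast; that is the ex-post evaluation argument made above.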
The evolutionary approach called scatter search originated from strategies for creating composite decision rules and surrogate constraints. Recent studies demonstrate the practical advantages of this approach for solving a diverse array of optimization problems from both classical and real-world settings. Scatter search contrasts with other evolutionary procedures, such as genetic algorithms, by providing unifying principles for joining solutions based on generalized path constructions in Euclidean space and by utilizing strategic designs where other approaches resort to randomization. The book's goal is to provide the basic principles and fundamental ideas that will allow readers to create successful applications of scatter search. The book includes the C source code of the methods introduced in each chapter.
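The book ships C implementations; purely as an illustration of the idea (a toy objective and made-up parameters, in Python for consistency with the other sketches here), the skeleton of scatter search looks like this: keep a small reference set, combine pairs along the segments joining them, improve the offspring, and update the set.

```python
# Minimal scatter-search sketch (hypothetical parameters, toy objective).
import random

def objective(x):
    return sum((xi - 1.0) ** 2 for xi in x)  # minimize; optimum at (1, 1)

def improve(x, step=0.05):
    # crude local improvement: accept coordinate nudges that help
    x = list(x)
    for i in range(len(x)):
        for d in (-step, step):
            trial = x[:]
            trial[i] += d
            if objective(trial) < objective(x):
                x = trial
    return x

def combine(a, b):
    # sample a point on the segment joining a and b (and slightly beyond),
    # the "generalized path construction" that replaces random crossover
    t = random.uniform(-0.25, 1.25)
    return [ai + t * (bi - ai) for ai, bi in zip(a, b)]

random.seed(0)
ref_set = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(5)]
for _ in range(50):
    a, b = random.sample(ref_set, 2)
    child = improve(combine(a, b))
    worst = max(ref_set, key=objective)
    if objective(child) < objective(worst):
        ref_set[ref_set.index(worst)] = child   # keep the better solution

best = min(ref_set, key=objective)
print([round(v, 2) for v in best], round(objective(best), 4))
```

The contrast with a genetic algorithm is visible in combine(): offspring are placed deliberately on paths between reference solutions rather than produced by randomized crossover and mutation.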
Advanced approaches to software engineering and design are capable of solving complex computational problems and achieving standards of performance that were unheard of only decades ago. Handbook of Research on Emerging Advancements and Technologies in Software Engineering presents a comprehensive investigation of the most recent discoveries in software engineering research and practice, with studies in software design, development, implementation, testing, analysis, and evolution. Software designers, architects, and technologists, as well as students and educators, will find this book to be a vital and in-depth examination of the latest notable developments within the software engineering community.
This book teaches the basics of XML with an original approach, using real-world examples from an interesting (and operating) environment with broad applicability. It covers the full spectrum of Berkeley DB XML tools, including the command-line shell, transactions, rollbacks, replication, archiving and monitoring. Techniques and concepts with broad applicability beyond the subject matter are skillfully explained: XML, XPath, XQuery and XML schemas are all industry-standard technologies that receive one of their best tutorial treatments here, all in the context of a simple database solution. The book also presents a remarkable example of query power.
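Path expressions are the common thread through XPath and XQuery. As a neutral taste of the idea (using Python's standard library rather than Berkeley DB XML's own API, with an invented two-book catalogue):

```python
# A small taste of XPath-style querying via Python's standard library
# (Berkeley DB XML offers far richer XQuery support; this only shows
# the path-expression idea on an invented document).
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<catalog>
  <book genre="databases"><title>Berkeley DB XML Primer</title></book>
  <book genre="xml"><title>XQuery by Example</title></book>
</catalog>
""")

# select titles of books whose genre attribute is "xml"
for title in doc.findall("./book[@genre='xml']/title"):
    print(title.text)   # XQuery by Example
```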
You may like...
*Forests of Southeast Europe Under a… by Mirjana Sijacic-Nikolic, Jelena Milovanovic, … (Hardcover, R4,095)
*Soil Emission of Nitrous Oxide and its… by David Ussiri, Rattan Lal (Hardcover)
*Preparing a Workforce for the New Blue… by Liesl Hotaling, Richard W. Spinrad (Paperback, R3,229)
*Biofertilizers for Sustainable… by Bhoopander Giri, Ramprasad, … (Hardcover, R6,600)
*Vegetation Dynamics and Crop Stress - An… by Dipanwita Dutta, Arnab Kundu, … (Paperback, R3,940)
*Changing Climate and Resource use… by Amitav Bhattacharya (Paperback)
*Silicon and Nano-silicon in… by Hassan Etesami, Abdullah H. Al Saeedi, … (Paperback, R3,480)