New Approaches to Circle Packing into the Square is devoted to the most recent results on the densest packing of equal circles in a square. In the last few decades, many articles have considered this question, which has been an object of interest since it is a hard challenge both in discrete geometry and in mathematical programming. The authors have studied this geometrical optimization problem for a long time, and they developed several new algorithms to solve it. The book completely covers the investigations on this topic.
Providing a wide variety of technologies for ensuring the safety and dependability of cyber-physical systems (CPS), this book offers a comprehensive introduction to the architecture-centric modeling, analysis, and verification of CPS. In particular, it focuses on model driven engineering methods including architecture description languages, virtual prototyping, and formal analysis methods. CPS are based on a new design paradigm intended to enable emerging software-intensive systems. Embedded computers and networks monitor and control the physical processes, usually with the help of feedback loops where physical processes affect computations and vice versa. The principal challenges in system design lie in this constant interaction of software, hardware and physics. Developing reliable CPS has become a critical issue for the industry and society, because many applications such as transportation, power distribution, medical equipment and tele-medicine are dependent on CPS. Safety and security requirements must be ensured by means of powerful validation tools. Satisfying such requirements, including quality of service, implies having formally proven the required properties of the system before it is deployed. The book is concerned with internationally standardized modeling languages such as AADL, SysML, and MARTE. As the effectiveness of the technologies is demonstrated with industrial sample cases from the automotive and aerospace sectors, links between the methods presented and industrial problems are clearly understandable. Each chapter is self-contained, addressing specific scientific or engineering problems, and identifying further issues. In closing, it includes perspectives on future directions in CPS design from an architecture analysis viewpoint.
Polynomial extremal problems (PEP) constitute one of the most important subclasses of nonlinear programming models. Their distinctive feature is that the objective function and the constraints can be expressed by polynomial functions in one or several variables. Let x = (x_1, ..., x_n) be a vector in the n-dimensional real linear space R^n, and let P_0(x), P_1(x), ..., P_m(x) be polynomial functions on R^n with real coefficients. In general, a PEP can be formulated in the following form: (0.1) find r = inf P_0(x) subject to the constraints (0.2) P_i(x) = 0, i = 1, ..., m (a constraint in the form of an inequality can be written in the form of an equality by introducing a new variable: for example, P(x) <= 0 is equivalent to P(x) + y^2 = 0). Boolean and mixed polynomial problems can be written in this form by adding, for each Boolean variable z, the equality z^2 - z = 0. Let a = (a_1, ..., a_n) be an integer vector with nonnegative entries. Denote by x^a the monomial in n variables of the form x^a = x_1^{a_1} * ... * x_n^{a_n}; then d(a) = a_1 + ... + a_n is the total degree of the monomial x^a. Each polynomial in n variables can be written as a sum of monomials with nonzero coefficients: P(x) = sum of c_a x^a over a in A(P), where A(P) is the set of monomials contained in the polynomial P.
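The monomial representation described above can be made concrete with a short sketch (the dictionary encoding and the function names are illustrative, not from the book): a polynomial is stored as a map from exponent vectors a to coefficients c_a, so that evaluating P(x) and computing the total degree follow directly from the definitions.

```python
def eval_poly(coeffs, x):
    """Evaluate P(x) = sum_a c_a * prod_i x_i**a_i from a
    {exponent-tuple: coefficient} dictionary."""
    total = 0.0
    for a, c in coeffs.items():
        term = c
        for xi, ai in zip(x, a):
            term *= xi ** ai
        total += term
    return total

def total_degree(coeffs):
    """Total degree of P: the maximum of d(a) = sum_i a_i
    over the monomials of P."""
    return max(sum(a) for a in coeffs)

# P(x1, x2) = x1^2 * x2 - 3 * x2 + 1
P = {(2, 1): 1.0, (0, 1): -3.0, (0, 0): 1.0}
print(eval_poly(P, (2.0, 1.0)))   # 4 - 3 + 1 = 2.0
print(total_degree(P))            # 3

# The inequality P(x) <= 0 becomes the equality P(x) + y^2 = 0;
# at any feasible point, y = sqrt(-P(x)).
```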
Problem solving is an essential part of every scientific discipline. It has two components: (1) problem identification and formulation, and (2) solution of the formulated problem. One can solve a problem on its own using ad hoc techniques or follow those techniques that have produced efficient solutions to similar problems. This requires the understanding of various algorithm design techniques, how and when to use them to formulate solutions and the context appropriate for each of them. This book advocates the study of algorithm design techniques by presenting most of the useful algorithm design techniques and illustrating them through numerous examples.
Another powerful contraction began and the pain jarred her back to the reality of the task at hand. In just a few minutes, the head appeared and the events of moments ago were repeated. Again, the Doktor held the baby by its feet and gave this one a good whack. Nothing happened. He tried again. There was still no cry from the baby. He laid the baby down and put his stethoscope to the tiny chest. A frown crossed his face. Nurse Kelm had seen that look before and understood. The Doktor tied and cut the cord just as he had done with the first baby and handed him to Ilse. She quickly wrapped the baby in a receiving blanket, picked him up and rushed out of the room. Freya watched this scene as if seeing it in slow motion. "Where is she taking my baby?" she screamed. The Doktor took her hand and said softly, "I'm sorry, Frau Muller, but he is dead." A heart-rending scream shattered the quiet of the room. Freya began to sob uncontrollably. The Doktor whispered to the second nurse and she handed him a syringe with a mild sedative. Freya didn't feel the needle enter her arm. She couldn't feel anything at that moment except a pain in her heart that made her oblivious to any physical pain.
System Design: A Practical Guide with SpecC presents the system design flow following a simple example through the whole process in an easy-to-follow, step-by-step fashion. Each step is described in detail in pictorial form and with code examples in SpecC. For each picture slide a detailed explanation is provided of the concepts presented. This format is suited for tutorials, seminars, self-study, as a guided reference carried by examples, or as teaching material for courses on system design. Features: Comprehensive introduction to and description of the SpecC language and design methodology; IP-centric language and methodology with focus on design reuse; Complete framework for system-level design from specification to implementation for SOCs and other embedded HW/SW systems. System Design: A Practical Guide with SpecC will benefit designers and design managers of complex SOCs, or embedded systems in general, by allowing them to develop new methodologies from these results, in order to increase design productivity by orders of magnitude. Designers at RTL, logical or physical levels, who are interested in moving up to the system level, will find a comprehensive overview within. The design models in the book define IP models and functions for IP exchange between IP providers and their users. A well-defined methodology like the one presented in this book will help product planning divisions to quickly develop new products or to derive completely new business models, like e-design or product-on-demand. Finally, researchers and students in the area of system design will find an example of a formal, well-structured design flow in this book.
Over the years, research in the life sciences has benefited greatly from the quantitative tools of mathematics and modeling. Many aspects of complex biological systems can be more deeply understood when mathematical techniques are incorporated into a scientific investigation. Modeling can be fruitfully applied in many types of biological research, from studies on the molecular, cellular, and organ level, to experiments in whole animals and in populations. Using the field of nutrition as an example, one can find many cases of recent advances in knowledge and understanding that were facilitated by the application of mathematical modeling to kinetic data. The availability of biologically important stable isotope-labeled compounds, developments in sensitive mass spectrometry and other analytical techniques, and advances in the powerful modeling software applied to data have each contributed to our ability to carry out ever more sophisticated kinetic studies that are relevant to nutrition and the health sciences at many levels of organization. Furthermore, we anticipate that modeling is on the brink of another major advance: the application of kinetic modeling to clinical practice. With advances in the ability of models to access large databases (e.g., a population of individual patient records) and the development of user interfaces that are "friendly" enough to be used by clinicians who are not modelers, we predict that health applications modeling will be an important new direction for modeling in the 21st century. This book contains manuscripts that are based on presentations at the seventh conference in a series focused on advancing nutrition and health research by fostering exchange among scientists from such disciplines as nutrition, biology, mathematics, statistics, kinetics, and computing.
The themes of the six previous conferences included general nutrition modeling (Canolty and Cain, 1985; Hoover-Plow and Chandra, 1988), amino acids and carbohydrates (Abumrad, 1991), minerals (Siva Subramanian and Wastney, 1995), vitamins, proteins, and modeling theory (Coburn and Townsend, 1996), and physiological compartmental modeling (Clifford and Muller, 1998). The seventh conference in the series was held at The Pennsylvania State University from July 29 through August 1, 2000. The meeting began with an instructive and entertaining keynote address by Professor Britton Chance, Eldridge Reeves Johnson University Professor Emeritus of Biophysics, Physical Chemistry, and Radiologic Physics, University of Pennsylvania.
Along with the traditional material concerning linear programming (the simplex method, the theory of duality, the dual simplex method), In-Depth Analysis of Linear Programming contains new results of research carried out by the authors. For the first time, the criteria of stability (in the geometrical and algebraic forms) of the general linear programming problem are formulated and proved. New regularization methods based on the idea of extension of an admissible set are proposed for solving unstable (ill-posed) linear programming problems. In contrast to the well-known regularization methods, in the methods proposed in this book the initial unstable problem is replaced by a new stable auxiliary problem. This is also a linear programming problem, which can be solved by standard finite methods. In addition, the authors indicate the conditions imposed on the parameters of the auxiliary problem which guarantee its stability, and this circumstance advantageously distinguishes the regularization methods proposed in this book from the existing methods. In these existing methods, the stability of the auxiliary problem is usually only presupposed but is not explicitly investigated. In this book, the traditional material contained in the first three chapters is expounded in much simpler terms than in the majority of books on linear programming, which makes it accessible to beginners as well as those more familiar with the area.
As the telecommunication industry introduces new sophisticated technologies, the nature of services and the volume of demands have changed. Indeed, a broad range of new services for users has appeared, combining voice, data, graphics, video, etc. This implies new planning issues. Fiber transmission systems that can carry large amounts of data on a few strands of wire were introduced. These systems have such a large bandwidth that the failure of even a single transmission link in the network can create a severe service loss to customers. Therefore, a very high level of service reliability is becoming imperative for both system users and service providers. Since equipment failures and accidents cannot be avoided entirely, networks have to be designed so as to "survive" failures. This is done by judiciously installing spare capacity over the network so that all traffic interrupted by a failure may be diverted around that failure by way of this spare or reserve capacity. This of course translates into huge investments for network operators. Designing such survivable networks while minimizing spare capacity costs is, not surprisingly, a major concern of operating companies, and it gives rise to very difficult combinatorial problems. In order to make telecommunication networks survivable, one can essentially use two different strategies: protection or restoration. The protection approach preassigns spare capacity to protect each element of the network independently, while the restoration approach spreads the redundant capacity over the whole network and uses it as required in order to restore the disrupted traffic.
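The difference between the two strategies can be illustrated with a toy ring network (the topology, demands, and rerouting rule here are invented for illustration and are much simpler than the combinatorial models the book studies): protection reserves spare capacity for each possible failure independently, while restoration lets all failure scenarios share the same spare pool.

```python
# Ring of 4 nodes; each link carries 1 unit of working traffic.
# If a link fails, its traffic is rerouted the long way around
# the ring, crossing the other three links.
links = ["AB", "BC", "CD", "DA"]
working = {l: 1.0 for l in links}

# Protection: spare is preassigned per protected element, so each
# link must reserve capacity for *every* failure it helps survive.
protection_spare = {l: sum(working[f] for f in links if f != l)
                    for l in links}

# Restoration: spare is shared across failure scenarios, so each
# link only needs the *maximum* rerouted load over all failures.
restoration_spare = {l: max(working[f] for f in links if f != l)
                     for l in links}

print(sum(protection_spare.values()))   # 12.0
print(sum(restoration_spare.values()))  # 4.0
```

Even on this tiny example, sharing the redundant capacity cuts the total spare investment by a factor of three, which is why restoration design leads to the harder optimization problems the blurb mentions.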
The core idea of this book is that object-oriented technology is a generic technology whose various technical aspects can be presented in a unified and consistent framework. This applies to both practical and formal aspects of object-oriented technology. Course tested in a variety of object-oriented courses, numerous examples, figures and exercises are presented in each chapter. The approach in this book is based on typed technologies, and the core notions fit mainstream object-oriented languages such as Java and C#. The book promotes object-oriented constraints (assertions), their specification and verification. Object-oriented constraints apply to specification and verification of object-oriented programs, specification of the object-oriented platform, more advanced concurrent models, database integrity constraints and object-oriented transactions, their specification and verification.
In this paper we propose a model of tax incentive optimization for investment projects using the mechanism of accelerated depreciation. Unlike tax holidays, which influence the effective income tax rate, accelerated depreciation affects taxable income. In modern economic practice the state actively uses mechanisms such as accelerated depreciation and tax holidays to attract investment into the creation of new enterprises. The problem under consideration is the following. Assume that the state (region) is interested in the realization of a certain investment project, for example, the creation of a new enterprise. In order to attract a potential investor, the state decides to use a mechanism of accelerated tax depreciation. The following question arises: what is a reasonable principle for choosing the depreciation rate? From the state's point of view, the future investor's behavior will be rational. This means that, observing the economic environment, the investor chooses the moment of investment that maximizes his expected net present value (NPV) from the given project. In this case both the criterion and the "investment rule" depend on the depreciation policy proposed by the state. For simplicity we will suppose that the state's purpose for a given project is the maximization of discounted tax payments into the budget from the enterprise after its creation. Of course, these payments depend on the moment of the investor's entry and, therefore, on the depreciation policy established by the state.
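The trade-off described here can be sketched numerically (all figures and the declining-balance depreciation rule below are hypothetical, not taken from the paper): a faster depreciation rate lowers taxable income in the early years, which shrinks the state's discounted tax receipts while making the project more attractive to the investor.

```python
def discounted_taxes(revenue, asset_cost, dep_rate, tax_rate, years, discount):
    """Discounted stream of corporate tax payments under
    declining-balance depreciation at rate dep_rate."""
    book_value = asset_cost
    npv_tax = 0.0
    for t in range(1, years + 1):
        depreciation = dep_rate * book_value
        book_value -= depreciation
        taxable_income = max(revenue - depreciation, 0.0)
        npv_tax += tax_rate * taxable_income / (1 + discount) ** t
    return npv_tax

slow = discounted_taxes(100.0, 300.0, 0.10, 0.25, 10, 0.05)
fast = discounted_taxes(100.0, 300.0, 0.40, 0.25, 10, 0.05)
print(round(slow, 2), round(fast, 2))
# Accelerated depreciation shifts the tax shield to early,
# heavily weighted years, so the discounted take is smaller:
# fast < slow.
```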
This unique text/reference reviews algorithms for the exact or approximate solution of shortest-path problems, with a specific focus on a class of algorithms called rubberband algorithms. Discussing each concept and algorithm in depth, the book includes mathematical proofs for many of the given statements. Topics and features: provides theoretical and programming exercises at the end of each chapter; presents a thorough introduction to shortest paths in Euclidean geometry, and the class of algorithms called rubberband algorithms; discusses algorithms for calculating exact or approximate ESPs in the plane; examines the shortest paths on 3D surfaces, in simple polyhedrons and in cube-curves; describes the application of rubberband algorithms for solving art gallery problems, including the safari, zookeeper, watchman, and touring polygons route problems; includes lists of symbols and abbreviations, in addition to other appendices.
A practical step-by-step approach for improving the software development process within a company, using the Software Engineering Institute's Capability Maturity Model (CMM). The text explains common misconceptions associated with Software Business Improvement and CMM, using real-world examples. The book includes a reference table of key software metrics, which: help the reader evaluate measurements in relation to the functioning of his/her organisation; direct the software development to achieve higher levels of CMM in a timely manner; link measurement techniques to specific KPAs in a practical manner; and improve software process definition and improvement techniques with CMM as a guideline.
[Figure 1-1. A nominal, multi-stage development process: Concept, Specification, Design, and Implementation stages, each supported by tools.] From that beginning, we have progressed to the point where the EDA community at large, including both users and developers of the tools, is interested in more unified environments. Here, the notion is that the tools used at the various stages in the development process need to be able to complement each other, and to communicate with one another efficiently using effective file exchange capabilities. Furthermore, the idea of capturing all the tool support needed for an EDA development into a unified support environment is now becoming a reality. This reality is evidenced by some of the EDA suites we now see emerging, wherein several tool functions are integrated under a common graphical user interface (GUI), with supporting file exchange and libraries to enable all tool functions to operate effectively and synergistically. This concept, which we illustrate in Figure 1-2, is the true future of EDA.
The latest edition of a classic text on concurrency and distributed programming - from a winner of the ACM/SIGCSE Award for Outstanding Contribution to Computer Science Education.
grams of which the objective is given by the ratio of a convex by a positive (over a convex domain) concave function. As observed by Sniedovich (Ref. [102, 103]), most of the properties of fractional programs could be found in other programs, given that the objective function could be written as a particular composition of functions. He called this new field C programming, standing for composite concave programming. In his seminal book on dynamic programming (Ref. [104]), Sniedovich shows how the study of such compositions can help in tackling non-separable dynamic programs that would otherwise defeat solution. Barros and Frenk (Ref. [9]) developed a cutting plane algorithm capable of optimizing C-programs. More recently, this algorithm has been used by Carrizosa and Plastria to solve a global optimization problem in facility location (Ref. [16]). The distinction between global optimization problems (Ref. [54]) and generalized convex problems can sometimes be hard to establish. That is exactly the reason why so much effort has been placed into finding an exhaustive classification of the different weak forms of convexity, establishing a new definition just to satisfy some desirable property in the most general way possible. This book does not aim at all the subtleties of the different generalizations of convexity, but concentrates on the most general of them all, quasiconvex programming. Chapter 5 shows clearly where the real difficulties appear.
This book represents the proceedings of the 9th SDL Forum which was
held in Montreal, Quebec, Canada, during the week of June 21-25,
1999. The 9th SDL Forum presents papers on the past and future
development of the MSC and SDL languages. The volume presents
information on experience with the use of these languages in
industrial development projects, on tools and techniques for using
these languages in the software and hardware development process,
and other aspects of these languages.
Many approaches have been proposed to enhance software productivity and reliability. These approaches typically fall into three categories: the engineering approach, the formal approach, and the knowledge-based approach. The optimal gain in software productivity cannot be obtained if one relies on only one of these approaches. Thus, the integration of different approaches has also become a major area of research. No approach can be said to be perfect if it fails to satisfy the following two criteria. Firstly, a good approach should support the full life cycle of software development. Secondly, a good approach should support the development of large-scale software for real use in many application domains. Such an approach can be referred to as a five-in-one approach. The authors of this book have, for the past eight years, conducted research in knowledge-based software engineering, of which the final goal is to develop a paradigm for software engineering which not only integrates the three approaches mentioned above, but also fulfils the two criteria on which the five-in-one approach is based. Domain Modeling-Based Software Engineering: A Formal Approach explores the results of this research. Domain Modeling-Based Software Engineering: A Formal Approach will be useful to researchers of knowledge-based software engineering, students and instructors of computer science, and software engineers who are working on large-scale projects of software development and want to use knowledge-based development methods in their work.
The BeOS is the exciting new operating system designed natively
for the Internet and digital media. Programmers are drawn to the
BeOS by its many state-of-the-art features, including pervasive
multithreading, a symmetric multiprocessing architecture, and an
integrated multithreaded graphics system. The Be engineering team
also built in many UNIX-like capabilities as part of a POSIX
toolkit. Best of all, the BeOS runs on a variety of Intel
architectures and PowerPC platforms and uses off-the-shelf
hardware. This book explores the BeOS from a POSIX programmer's point of
view, providing a comprehensive and practical guide to porting UNIX
and other POSIX-based software to the BeOS. BeOS: Porting UNIX
Applications will help you move your favorite UNIX software to an
environment designed from the ground up for high-performance
applications.
This monograph focuses on recent advances in smart, multimedia and computer gaming technologies. The contributions include: *Smart Gamification and Smart Serious Games. *Fusion of secure IPsec-based Virtual Private Network, mobile computing and rich multimedia technology. *Teaching and Promoting Smart Internet of Things Solutions Using the Serious-game Approach. *Evaluation of Student Knowledge using an e-Learning Framework. *The iTEC Eduteka. *3D Virtual Worlds as a Fusion of Immersing, Visualizing, Recording, and Replaying Technologies. *Fusion of multimedia and mobile technology in audio guides for Museums and Exhibitions: from Bluetooth Push to Web Pull. The book is directed to researchers, students and software developers working in the areas of education and information technologies.
This book provides graduate students and practitioners with knowledge of the CORBA standard and practical experience of implementing distributed systems with CORBA's Java mapping, supported by tested code examples that run immediately.
Mathematical Programming and Financial Objectives for Scheduling Projects focuses on decision problems where the performance is measured in terms of money. As the title suggests, special attention is paid to financial objectives and the relationship of financial objectives to project schedules and scheduling. In addition, how schedules relate to other decisions is treated in detail. The book demonstrates that scheduling must be combined with project selection and financing, and that scheduling helps to give an answer to the planning issue of the amount of resources required for a project. The author makes clear the relevance of scheduling to cutting budget costs. The book is divided into six parts. The first part gives a brief introduction to project management. Part two examines scheduling projects in order to maximize their net present value. Part three considers capital rationing. Many decisions on selecting or rejecting a project cannot be made in isolation and multiple projects must be taken fully into account. Since the requests for capital resources depend on the schedules of the projects, scheduling takes on more complexity. Part four studies the resource usage of a project in greater detail. Part five discusses cases where the processing time of an activity is a decision to be made. Part six summarizes the main results that have been accomplished.
The present economic and social environment has given rise to new situations within which companies must operate. As a first example, the globalization of the economy and the need for performance has led companies to outsource and then to operate inside networks of enterprises such as supply chains or virtual enterprises. A second instance is related to environmental issues. Awareness of the impact of industrial activities on the environment has led companies to revise processes, to save energy, and to optimize transportation. A last example relates to knowledge. Knowledge is considered today to be one of the main assets of a company. How to capitalize on it, manage it, and reuse it for the benefit of the company is an important current issue. The three examples above have no direct links. However, each of them constitutes a challenge that companies have to face today. This book brings together the opinions of several leading researchers from all around the world. Together they try to develop new approaches and find answers to those challenges. Through the individual chapters of this book, the authors present their understanding of the different challenges, the concepts on which they are working, the approaches they are developing, and the tools they propose. The book is composed of six parts; each one focuses on a specific theme and is subdivided into subtopics.
Researchers working with nonlinear programming often claim "the world is nonlinear", indicating that real applications require nonlinear modeling. The same is true for other areas such as multi-objective programming (there are always several goals in a real application), stochastic programming (all data is uncertain and therefore stochastic models should be used), and so forth. In this spirit we claim: the world is multilevel. In many decision processes there is a hierarchy of decision makers, and decisions are made at different levels in this hierarchy. One way to handle such hierarchies is to focus on one level and include other levels' behaviors as assumptions. Multilevel programming is the research area that focuses on the whole hierarchy structure. In terms of modeling, the constraint domain associated with a multilevel programming problem is implicitly determined by a series of optimization problems which must be solved in a predetermined sequence. If only two levels are considered, we have one leader (associated with the upper level) and one follower (associated with the lower level).
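The leader-follower structure can be sketched with a tiny brute-force example (the payoff functions and decision grids are invented for illustration; real bilevel solvers are far more sophisticated): for each leader decision x, the follower first solves its own minimization in y, and the leader then optimizes over x while anticipating that rational response.

```python
def follower_response(x, ys):
    """Lower level: the follower minimizes its own cost given
    the leader's decision x (illustrative cost function)."""
    return min(ys, key=lambda y: (y - x) ** 2 + y)

def solve_bilevel(xs, ys):
    """Upper level: the leader maximizes its payoff, anticipating
    the follower's rational response y(x)."""
    best = None
    for x in xs:
        y = follower_response(x, ys)
        payoff = x * y - 0.5 * x ** 2   # leader's (illustrative) objective
        if best is None or payoff > best[2]:
            best = (x, y, payoff)
    return best

grid = [i / 10 for i in range(0, 21)]   # decisions in [0, 2]
x_star, y_star, value = solve_bilevel(grid, grid)
print(x_star, y_star, value)            # 2.0 1.5 1.0
```

Note that the leader cannot simply optimize over (x, y) jointly: the pairs it may reach are exactly those where y solves the lower-level problem, which is the implicitly determined constraint domain the blurb refers to.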