Enterprises have made amazing advances by taking advantage of data about their business to provide predictions and understanding of their customers, markets, and products. But as the world of business becomes more interconnected and global, enterprise data is no longer a monolith; it is just a part of a vast web of data. Managing data on a world-wide scale is a key capability for any business today. The Semantic Web treats data as a distributed resource on the scale of the World Wide Web, and incorporates features to address the challenges of massive data distribution as part of its basic design. The aim of the first two editions was to motivate the Semantic Web technology stack from end to end; to describe not only what the Semantic Web standards are and how they work, but also what their goals are and why they were designed as they are. It tells a coherent story from beginning to end of how the standards work to manage a world-wide distributed web of knowledge in a meaningful way. The third edition builds on this foundation to bring Semantic Web practice to the enterprise. Fabien Gandon joins Dean Allemang and Jim Hendler, bringing with him years of experience in linked data, to open up the story to a modern view of global linked data. While the overall story is the same, the examples have been brought up to date and applied in a modern setting, where enterprise and global data come together as a living, linked network of data. Also included with the third edition, all of the data sets and queries are available online for study and experimentation at data.world/swwo.
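The central idea above — that independently published data sets merge into one distributed web of knowledge — can be sketched with plain Python sets of triples as a toy stand-in for RDF. All names and data here are invented for illustration; the data.world/swwo data sets are not used.

```python
# Two independently maintained data sources, each a set of
# (subject, predicate, object) triples, RDF-style.
enterprise_data = {
    ("ex:ACME", "ex:sells", "ex:Widget"),
    ("ex:Widget", "ex:price", "9.99"),
}
public_data = {
    ("ex:Widget", "rdfs:label", "Widget"),
    ("ex:ACME", "ex:locatedIn", "ex:Utrecht"),
}

# Merging distributed graphs is just set union: a key design
# property of the triple model.
graph = enterprise_data | public_data

def match(graph, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in graph
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Everything known about ex:Widget, regardless of which source said it.
facts = match(graph, s="ex:Widget")
```

The union operation is the whole point of the sketch: no schema negotiation is needed before two graphs can be combined and queried together.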
A modern information retrieval system must have the capability to find, organize and present very different manifestations of information - such as text, pictures, videos or database records - any of which may be of relevance to the user. However, the concept of relevance, while seemingly intuitive, is actually hard to define, and it's even harder to model in a formal way. Lavrenko does not attempt to bring forth a new definition of relevance, nor provide arguments as to why any particular definition might be theoretically superior or more complete. Instead, he takes a widely accepted, albeit somewhat conservative definition, makes several assumptions, and from them develops a new probabilistic model that explicitly captures that notion of relevance. With this book, he makes two major contributions to the field of information retrieval: first, a new way to look at topical relevance, complementing the two dominant models, i.e., the classical probabilistic model and the language modeling approach, and which explicitly combines documents, queries, and relevance in a single formalism; second, a new method for modeling exchangeable sequences of discrete random variables which does not make any structural assumptions about the data and which can also handle rare events. Thus his book is of major interest to researchers and graduate students in information retrieval who specialize in relevance modeling, ranking algorithms, and language modeling.
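As a point of reference for the discussion above, here is a minimal sketch of the classical language-modeling approach that the text names as one of the two dominant models (this is not Lavrenko's relevance model itself). Documents are ranked by the probability that a smoothed document language model generates the query; the documents, query, and smoothing weight are invented for illustration.

```python
import math
from collections import Counter

docs = {
    "d1": "the quick brown fox".split(),
    "d2": "the lazy brown dog".split(),
}
collection = [w for d in docs.values() for w in d]
coll_counts = Counter(collection)

def score(query, doc, lam=0.5):
    """log P(query | doc model) with Jelinek-Mercer smoothing:
    mix the document's word statistics with the whole collection's."""
    counts = Counter(doc)
    s = 0.0
    for w in query:
        p_doc = counts[w] / len(doc)
        p_coll = coll_counts[w] / len(collection)
        s += math.log(lam * p_doc + (1 - lam) * p_coll)
    return s

query = "brown fox".split()
ranked = sorted(docs, key=lambda d: score(query, docs[d]), reverse=True)
```

Note that relevance never appears explicitly in this formalism — which is exactly the gap the book's relevance model is built to close.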
Advice involves recommendations on what to think; through thought, on what to choose; and via choices, on how to act. Advice is information that moves by communication, from advisors to the recipient of advice. Ivan Jureta offers a general way to analyze advice. The analysis applies regardless of what the advice is about and from whom it comes or to whom it needs to be given, and it concentrates on the production and consumption of advice independent of the field of application. It is made up of two intertwined parts, a conceptual analysis and an analysis of the rationale of advice. He premises that giving advice is a design problem and he treats advice as an artifact designed and used to influence decisions. What is unusual is the theoretical backdrop against which the author's discussions are set: ontology engineering, conceptual analysis, and artificial intelligence. While classical decision theory would be expected to play a key role, this is not the case here for one principal reason: the difficulty of having relevant numerical, quantitative estimates of probability and utility in most practical situations. Instead conceptual models and mathematical logic are the author's tools of choice. The book is primarily intended for graduate students and researchers of management science. They are offered a general method of analysis that applies to giving and receiving advice when the decision problems are not well structured, and when there is imprecise, unclear, incomplete, or conflicting qualitative information.
Along with the traditional material concerning linear programming (the simplex method, the theory of duality, the dual simplex method), In-Depth Analysis of Linear Programming contains new results of research carried out by the authors. For the first time, the criteria of stability (in the geometrical and algebraic forms) of the general linear programming problem are formulated and proved. New regularization methods based on the idea of extension of an admissible set are proposed for solving unstable (ill-posed) linear programming problems. In contrast to the well-known regularization methods, in the methods proposed in this book the initial unstable problem is replaced by a new stable auxiliary problem. This is also a linear programming problem, which can be solved by standard finite methods. In addition, the authors indicate the conditions imposed on the parameters of the auxiliary problem which guarantee its stability, and this circumstance advantageously distinguishes the regularization methods proposed in this book from the existing methods. In these existing methods, the stability of the auxiliary problem is usually only presupposed but is not explicitly investigated. In this book, the traditional material contained in the first three chapters is expounded in much simpler terms than in the majority of books on linear programming, which makes it accessible to beginners as well as those more familiar with the area.
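To make the subject matter concrete, here is a tiny invented linear program (not an example from the book): maximize 3x + 2y subject to x + y <= 4, x <= 2, x, y >= 0. For two variables the optimum can be found by enumerating the vertices where pairs of constraint boundaries intersect; the simplex method walks these same vertices far more cleverly.

```python
from itertools import combinations

# Constraints in the form a*x + b*y <= c (including x >= 0, y >= 0).
constraints = [
    (1, 1, 4),    # x + y <= 4
    (1, 0, 2),    # x <= 2
    (-1, 0, 0),   # -x <= 0, i.e. x >= 0
    (0, -1, 0),   # -y <= 0, i.e. y >= 0
]

def intersect(c1, c2):
    """Intersection point of two constraint boundaries (Cramer's rule)."""
    (a1, b1, r1), (a2, b2, r2) = c1, c2
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None  # parallel boundaries
    x = (r1 * b2 - r2 * b1) / det
    y = (a1 * r2 - a2 * r1) / det
    return (x, y)

def feasible(pt):
    return all(a * pt[0] + b * pt[1] <= c + 1e-9 for a, b, c in constraints)

vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 2 * p[1])
```

The optimum lands at the vertex (2, 2) with objective value 10, illustrating the fundamental fact the simplex method exploits: a bounded LP attains its optimum at a vertex of the feasible region.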
This book provides graduate students and practitioners with knowledge of the CORBA standard and practical experience of implementing distributed systems with CORBA's Java mapping. It includes tested code examples that will run immediately.
In recent years, IT application scenarios have evolved in very innovative ways. Highly distributed networks have now become a common platform for large-scale distributed programming, high-bandwidth communications are inexpensive and widespread, and most of our work tools are equipped with processors enabling us to perform a multitude of tasks. In addition, mobile computing (referring specifically to wireless devices and, more broadly, to dynamically configured systems) has made it possible to exploit interaction in novel ways.
Mathematical Programming and Financial Objectives for Scheduling Projects focuses on decision problems where the performance is measured in terms of money. As the title suggests, special attention is paid to financial objectives and the relationship of financial objectives to project schedules and scheduling. In addition, how schedules relate to other decisions is treated in detail. The book demonstrates that scheduling must be combined with project selection and financing, and that scheduling helps to answer the planning question of the amount of resources required for a project. The author makes clear the relevance of scheduling to cutting budget costs. The book is divided into six parts. The first part gives a brief introduction to project management. Part two examines scheduling projects in order to maximize their net present value. Part three considers capital rationing. Many decisions on selecting or rejecting a project cannot be made in isolation, and multiple projects must be taken fully into account. Since the requests for capital resources depend on the schedules of the projects, scheduling takes on more complexity. Part four studies the resource usage of a project in greater detail. Part five discusses cases where the processing time of an activity is a decision to be made. Part six summarizes the main results that have been accomplished.
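The core financial objective named above — maximizing a project's net present value — can be illustrated with a small invented example: the same payments yield a different NPV depending on when the schedule places them, which is exactly why scheduling and financial evaluation must be combined.

```python
def npv(cash_flows, rate):
    """Net present value of (period, amount) cash flows,
    discounted at `rate` per period."""
    return sum(amount / (1 + rate) ** t for t, amount in cash_flows)

# Same investment and receipts; only the schedule differs.
early_payment = [(0, -100), (1, 60), (2, 60)]  # receipts arrive early
late_payment = [(0, -100), (3, 60), (4, 60)]   # receipts arrive late

rate = 0.10  # 10% discount rate per period (illustrative)
```

With these numbers the early schedule has a positive NPV while the late one is negative, so the timing decision alone turns a worthwhile project into a losing one.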
The BeOS is the exciting new operating system designed natively for the Internet and digital media. Programmers are drawn to the BeOS by its many state-of-the-art features, including pervasive multithreading, a symmetric multiprocessing architecture, and an integrated multithreaded graphics system. The Be engineering team also built in many UNIX-like capabilities as part of a POSIX toolkit. Best of all, the BeOS runs on a variety of Intel architectures and PowerPC platforms and uses off-the-shelf hardware. This book explores the BeOS from a POSIX programmer's point of view, providing a comprehensive and practical guide to porting UNIX and other POSIX-based software to the BeOS. BeOS: Porting UNIX Applications will help you move your favorite UNIX software to an environment designed from the ground up for high-performance applications.
In the past several years, there have been significant technological advances in the field of crisis response. However, many aspects concerning the efficient collection and integration of geo-information, applied semantics and situation awareness for disaster management remain open. Improving crisis response systems and making them intelligent requires extensive collaboration between emergency responders, disaster managers, system designers and researchers alike. To facilitate this process, the Gi4DM (GeoInformation for Disaster Management) conferences have been held regularly since 2005. The events are coordinated by the Joint Board of Geospatial Information Societies (JB GIS) and ICSU GeoUnions. This book presents the outcomes of the Gi4DM 2018 conference, which was organised by the ISPRS-URSI Joint Working Group ICWG III/IVa: Disaster Assessment, Monitoring and Management and held in Istanbul, Turkey on 18-21 March 2018. It includes 12 scientific papers focusing on the intelligent use of geo-information, semantics and situation awareness.
Researchers working with nonlinear programming often claim "the world is nonlinear", indicating that real applications require nonlinear modeling. The same is true for other areas such as multi-objective programming (there are always several goals in a real application), stochastic programming (all data is uncertain and therefore stochastic models should be used), and so forth. In this spirit we claim: the world is multilevel. In many decision processes there is a hierarchy of decision makers, and decisions are made at different levels in this hierarchy. One way to handle such hierarchies is to focus on one level and include other levels' behaviors as assumptions. Multilevel programming is the research area that focuses on the whole hierarchy structure. In terms of modeling, the constraint domain associated with a multilevel programming problem is implicitly determined by a series of optimization problems which must be solved in a predetermined sequence. If only two levels are considered, we have one leader (associated with the upper level) and one follower (associated with the lower level).
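The leader-follower structure described above can be sketched with a toy discrete two-level problem, invented for illustration: the leader picks x first, the follower then picks the y that is best for the follower given x, and the leader must anticipate that reaction when choosing x.

```python
def follower_best(x, ys):
    """The follower's rational reaction: maximize its own payoff
    y * (4 - x - y) given the leader's choice x."""
    return max(ys, key=lambda y: y * (4 - x - y))

def leader_payoff(x, ys):
    """The leader's payoff, evaluated at the follower's reaction."""
    y = follower_best(x, ys)
    return x * (4 - x - y)

# Small discrete decision sets, so we can solve by enumeration.
xs = ys = [0, 1, 2, 3]
best_x = max(xs, key=lambda x: leader_payoff(x, ys))
```

Note how the leader's constraint domain is determined implicitly by the follower's optimization problem, exactly as the text describes for multilevel programs in general.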
This book introduces wireless personal communications from the point of view of wireless communication system researchers. Existing sources on wireless communications put more emphasis on simulation and on fundamental principles of how to build a study model. In this volume, the aim is to pass on to readers as much knowledge as is essential for completing the model building of wireless communications, focusing on wireless personal area networks (WPANs). This book is the first of its kind to give step-by-step details on how to build a WPAN simulation model, and the many study models presented help readers form a clear picture of the whole wireless simulation model. The book is also the first treatise on wireless communication to give a comprehensive introduction to data-length complexity, the computational complexity of the processed data, and error control schemes. This volume is useful for all academic and technical staff in the fields of telecommunications and wireless communications, as it presents many scenarios for enhancing weak error control performance and others for reducing the complexity of wireless data and image transmission. Many examples are given to help readers understand the material covered in the book. Additional resources, such as MATLAB codes for some of the examples, are also presented.
Since the early 1990s, genetic programming (GP), a discipline whose goal is to enable the automatic generation of computer programs, has emerged as one of the most promising paradigms for fast, productive software development. GP combines biological metaphors gleaned from Darwin's theory of evolution with computer-science approaches drawn from the field of machine learning to create programs that are capable of adapting or recreating themselves for open-ended tasks.
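The GP loop described above — candidate programs, a fitness measure, and selection pressure — can be sketched in miniature. This heavily simplified version represents programs as expression trees, scores them against an invented target program (x*x + 1), and uses random restarts as a stand-in for GP's mutation and crossover operators; it is an illustration of the idea, not a faithful GP implementation.

```python
import random

random.seed(0)  # make the run reproducible

OPS = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}
TERMINALS = ["x", 0, 1, 2]

def evaluate(tree, x):
    """Interpret an expression tree for a given input x."""
    if tree == "x":
        return x
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def random_tree(depth=3):
    """Grow a random expression tree of bounded depth."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(list(OPS)),
            random_tree(depth - 1), random_tree(depth - 1))

def error(tree):
    # Fitness: squared error against the target program x*x + 1.
    return sum((evaluate(tree, x) - (x * x + 1)) ** 2 for x in range(-3, 4))

best = random_tree()
initial_error = error(best)
for _ in range(300):  # random restarts stand in for mutation/crossover
    challenger = random_tree()
    if error(challenger) < error(best):
        best = challenger
final_error = error(best)
```

Real GP systems maintain a whole population and recombine subtrees between fit individuals, but the essential cycle of generate, evaluate, and select is the same.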
The purpose of the 11th International Conference on Software Engineering Research, Management and Applications (SERA 2013), held on August 7-9, 2013 in Prague, Czech Republic, was to bring together scientists, engineers, computer users, and students to share their experiences, to exchange new ideas and research results about all aspects (theory, applications and tools) of Software Engineering Research, Management and Applications, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected 17 outstanding papers from those accepted for presentation at the conference in order to publish them in this volume. The papers were chosen based on review scores submitted by members of the program committee and further rigorous rounds of review.
This book contains a collection of survey papers in the areas of algorithms, languages and complexity, the three areas in which Professor Ronald V. Book has made significant contributions. As former students and co-authors who have been influenced by him directly, we would like to dedicate this book to Professor Ronald V. Book to honor and celebrate his sixtieth birthday. Professor Book initiated his brilliant academic career in 1958, graduating from Grinnell College with a Bachelor of Arts degree. He obtained a Master of Arts in Teaching degree in 1960 and a Master of Arts degree in 1964, both from Wesleyan University, and a Doctor of Philosophy degree from Harvard University in 1969, under the guidance of Professor Sheila A. Greibach. Professor Book's research in discrete mathematics and theoretical computer science is reflected in more than 150 scientific publications. These works have made a strong impact on the development of several areas of theoretical computer science. A more detailed summary of his scientific research appears in this volume separately.
This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.
Robust Technology with Analysis of Interference in Signal Processing discusses for the first time the theoretical fundamentals and algorithms of analysis of noise as an information carrier. On their basis the robust technology of noisy signals processing is developed. This technology can be applied to solving the problems of control, identification, diagnostics, and pattern recognition in petrochemistry, energetics, geophysics, medicine, physics, aviation, and other sciences and industries. The text explores the emergent possibility of forecasting failures on various objects, in conjunction with the fact that failures follow the hidden microchanges revealed via interference estimates. This monograph is of interest to students, postgraduates, engineers, scientific associates and others who are concerned with the processing of measuring information on computers.
The book provides the reader with a clear understanding of what software reuse is, where the problems are, what benefits to expect, the activities involved, and the different forms of software reuse. The reader is also given an overview of what software components are, different kinds of components and compositions, a taxonomy thereof, and examples of successful component reuse. An introduction to software engineering and software process models is also provided. Consequences and influences of systematic reuse of software components are depicted, and activities like domain engineering, component engineering and application engineering are described. The importance of documentation is taken into consideration as well.
This book is an up-to-date self-contained compendium of the research carried out by the authors on model-based diagnosis of a class of discrete-event systems called active systems. After defining the diagnosis problem, the book copes with a variety of reasoning mechanisms that generate the diagnosis, possibly within a monitoring setting. The book is structured into twelve chapters, each of which has its own introduction and concludes with bibliographic notes and itemized summaries. Concepts and techniques are presented with the help of numerous examples, figures, and tables, and when appropriate these concepts are formalized into propositions and theorems, while detailed algorithms are expressed in pseudocode. This work is primarily intended for researchers, professionals, and graduate students in the fields of artificial intelligence and control theory.
Privacy requirements have an increasing impact on the realization of modern applications. Commercial and legal regulations demand that privacy guarantees be provided whenever sensitive information is stored, processed, or communicated to external parties. Current approaches encrypt sensitive data, thus reducing query execution efficiency and preventing selective information release. Preserving Privacy in Data Outsourcing presents a comprehensive approach for protecting highly sensitive information when it is stored on systems that are not under the data owner's control. The approach illustrated combines access control and encryption, enforcing access control via structured encryption. This solution, coupled with efficient algorithms for key derivation and distribution, provides efficient and secure authorization management on outsourced data, allowing the data owner to outsource not only the data but the security policy itself. To reduce the amount of data to be encrypted, the book also investigates data fragmentation as a complementary means of protecting the privacy of data associations: associations broken by fragmentation are visible only to users authorized (by knowing the proper key) to join the fragments. Finally, the book investigates the problem of executing queries over data possibly distributed at different servers, where execution must be controlled to ensure that sensitive information and sensitive associations are visible only to authorized parties. Case studies are provided throughout the book. Professionals working in privacy, data mining, data protection, data outsourcing, electronic commerce, and machine learning will find this book a valuable asset, as will members of associations such as ACM and IEEE. This book is also suitable for advanced-level students and researchers concentrating on computer science, as a secondary text or reference book.
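The idea of enforcing access control through key derivation can be sketched with a one-way hash: the owner derives per-fragment keys from a single master secret and hands each user only the keys for what that user may read. This is a minimal sketch under invented labels and keys, not the book's actual scheme.

```python
import hashlib

def derive(parent_key: bytes, label: str) -> bytes:
    """Derive a child key from a parent key and a public label.
    SHA-256 is one-way, so a child key reveals nothing about its parent."""
    return hashlib.sha256(parent_key + label.encode()).digest()

owner_key = b"owner-master-secret"        # known only to the data owner
k_medical = derive(owner_key, "medical")  # key for the medical fragment
k_billing = derive(owner_key, "billing")  # key for the billing fragment

# A user authorized for billing data receives k_billing only: it
# decrypts the billing fragment, reveals nothing about k_medical,
# and cannot be used to recover owner_key.
```

Because keys are derivable from the master secret plus public labels, the owner never needs to store the whole key set — which is the efficiency argument behind key-derivation-based authorization on outsourced data.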
In August 1997 a conference titled "From Local to Global Optimization" was held at Storgarden in Rimforsa near the Linkoping Institute of Technology, Sweden. The conference gave us the opportunity to celebrate Hoang Tuy's achievements in optimization during his 70 years of life. This book consists of a collection of research papers based on results presented during the conference and is dedicated to Professor Hoang Tuy on the occasion of his 70th birthday. The papers cover a wide range of recent results in Mathematical Programming. The work of Hoang Tuy, in particular in Global Optimization, has provided directions for new algorithmic developments in the field. We are indebted to Kluwer Academic Publishers for inviting us to publish this volume, and to the Center for Industrial Information Transfer (CENIIT) for financial support. We wish to thank the referees for their help and the authors for their papers. We also wish to join all contributors of this book in expressing birthday wishes and gratitude to Hoang Tuy for his inspiration, support, and friendship to all of us. Athanasios Migdalas, Panos M. Pardalos, and Peter Varbrand, November 1998. Hoang Tuy: An Appreciation. It's a pleasure for me as colleague and friend to take this opportunity to celebrate Hoang Tuy's numerous contributions to the field of mathematical programming.
The methodology described in this book is the result of many years of research experience in the field of synthesizable VHDL design targeting FPGA-based platforms. VHDL was first conceived as a documentation language for ASIC designs. Afterwards, the language was used for the behavioral simulation of ASICs, and also as a design input for synthesis tools. VHDL is a rich language, but just a small subset of it can be used to write synthesizable code, from which a physical circuit can be obtained. Usually VHDL books describe both synthesis and simulation aspects of the language, but in this book the reader is guided only through the features acceptable to synthesis tools. The book introduces the subjects in a gradual and concise way, providing just enough information for the reader to develop their synthesizable digital systems in VHDL. The examples in the book were planned targeting an FPGA platform widely used around the world.
The core idea of this book is that object-oriented technology is a generic technology whose various technical aspects can be presented in a unified and consistent framework. This applies to both practical and formal aspects of object-oriented technology. The material has been course-tested in a variety of object-oriented courses, and numerous examples, figures and exercises are presented in each chapter. The approach in this book is based on typed technologies, and the core notions fit mainstream object-oriented languages such as Java and C#. The book promotes object-oriented constraints (assertions), their specification and verification. Object-oriented constraints apply to the specification and verification of object-oriented programs, specification of the object-oriented platform, more advanced concurrent models, database integrity constraints, and object-oriented transactions, their specification and verification.
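The object-oriented constraints (assertions) promoted above can be illustrated with a class invariant: a condition the object checks after construction and after every mutating method. This small sketch is in Python rather than the Java or C# used in the text, and the class and its invariant are invented for illustration.

```python
class BankAccount:
    """An account whose invariant (non-negative balance) is asserted
    after every state change, in the spirit of design by contract."""

    def __init__(self, balance: int = 0):
        self.balance = balance
        self._check_invariant()

    def _check_invariant(self):
        # Invariant: the balance never goes negative.
        assert self.balance >= 0, "invariant violated: negative balance"

    def withdraw(self, amount: int):
        # Precondition: sufficient funds.
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        self._check_invariant()  # re-establish the invariant

account = BankAccount(100)
account.withdraw(30)
```

The precondition rejects bad inputs at the boundary, while the invariant check guards against bugs inside the class itself; verification tools for languages like Java and C# mechanize exactly this separation.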
This book provides an extensive review of three interrelated issues: land fragmentation, land consolidation, and land reallocation, and it presents in detail the theoretical background, design, development and application of a prototype integrated planning and decision support system for land consolidation. The system integrates geographic information systems (GIS) and artificial intelligence techniques, including expert systems (ES) and genetic algorithms (GAs), with multi-criteria decision methods (MCDM), both multi-attribute (MADM) and multi-objective (MODM). The system is based on four modules for measuring land fragmentation; automatically generating alternative land redistribution plans; evaluating those plans; and automatically designing the land partitioning plan. The presented research provides a new scientific framework for land-consolidation planning both in terms of theory and practice, by presenting new findings and by developing better tools and methods embedded in an integrated GIS environment. It also makes a valuable contribution to the fields of GIS and spatial planning, as it provides new methods and ideas that could be applied to improve the former for the benefit of the latter in the context of planning support systems. From the 1960s onward, ambitious research activities have addressed IT support for the complex and time-consuming redistribution processes within land consolidation, without any practically relevant results until now. This scientific work is likely to close that gap. This distinguished publication is highly recommended to land consolidation planning experts, researchers and academics alike. Prof. Dr.-Ing. Joachim Thomas, Munster, Germany; Prof. Michael Batty, University College London
A practical step-by-step approach for improving the software development process within a company, using the Software Engineering Institute's Capability Maturity Model (CMM). The text explains common misconceptions associated with Software Business Improvement and CMM, using real-world examples. The book includes a reference table of key software metrics, which: help the reader evaluate measurements in relation to the functioning of his/her organisation; direct the software development to achieve higher levels of CMM in a timely manner; link measurement techniques to specific KPAs in a practical manner; and improve software process definition and improvement techniques with CMM as a guideline.
The author introduces the reader to the creation and implementation of space-related models by applying a learning-by-doing and problem-oriented approach. The required procedural skills are rarely taught at universities and many scientists and engineers struggle to transfer a model into a computer program. The purpose of this book is to fill this gap. It moves from simple to more complex applications, covering various important topics in the sequence: dynamic matrix processing, 2D and 3D graphics, databases, Java applets and parallel computing. A file (SMOP.zip) with all examples can be downloaded free of charge from the Internet at http://de.geocities.com/bsttc2/book.