Part I presents the impact of an integro-differential operator on parity logic engines (PLEs) as a tool for scientific modeling from scratch. Part II outlines the fuzzy structural modeling approach for building new linear and nonlinear dynamical causal forecasting systems in terms of fuzzy cognitive maps (FCMs). Part III introduces the new type of autogenetic algorithms (AGAs) to the field of evolutionary computing. Altogether, these PLEs, FCMs, and AGAs may serve as conceptual and computational power tools.
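To make the FCM idea concrete, here is a minimal sketch of one causal-map update step; the three concepts, the weight matrix, and the sigmoid squashing function are illustrative assumptions, not material from the book.

```python
import numpy as np

def fcm_step(state, weights, squash=lambda x: 1.0 / (1.0 + np.exp(-x))):
    """One update of a fuzzy cognitive map: each concept's new activation
    is a squashed weighted sum of the activations of its causes."""
    return squash(weights.T @ state)

# Illustrative 3-concept map (hypothetical weights, not from the book):
# weights[i, j] is the causal influence of concept i on concept j.
weights = np.array([[0.0, 0.6, -0.4],
                    [0.0, 0.0,  0.7],
                    [0.3, 0.0,  0.0]])
state = np.array([1.0, 0.0, 0.0])
for _ in range(10):
    state = fcm_step(state, weights)
print(state)  # activations of the three concepts after ten iterations
```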
While most discoverability evaluation studies in the Library and Information Science field discuss the intersection of discovery layers and library systems, this book looks specifically at digital repositories, examining discoverability through the lenses of system structure, user searches, and external discovery avenues. Discoverability, the ease with which information can be found by a user, is the cornerstone of all successful digital information platforms. Yet, most digital repository practitioners and researchers lack a holistic and comprehensive understanding of how and where discoverability happens. This book brings together current understandings of user needs and behaviors and poses them alongside a deeper examination of digital repositories around the theme of discoverability. It examines discoverability in digital repositories from both user and system perspectives by exploring how users access content (including their search patterns and habits, need for digital content, effects of outreach, or integration with Wikipedia and other web-based tools) and how systems support or prevent discoverability through the structure or quality of metadata, system interfaces, exposure to search engines or lack thereof, and integration with library discovery tools. Discoverability in Digital Repositories will be particularly useful to digital repository managers, practitioners, and researchers, metadata librarians, systems librarians, and user studies, usability and user experience librarians. Additionally, and perhaps most prominently, this book is composed with the emerging practitioner in mind. Instructors and students in Library and Information Science and Information Management programs will benefit from this book that specifically addresses discoverability in digital repository systems and services.
Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples, powered by the Java platform, can easily be transformed into other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited to a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from a basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discovery. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis on the Java Platform is a great choice for those who want to learn how statistical data analysis can be done using popular programming languages, who want to integrate data analysis algorithms into full-scale applications, and who want to deploy such calculations on web pages or computational servers regardless of their operating system. It is an excellent reference for scientific computations to solve real-world problems using a comprehensive stack of open-source Java libraries included in the DataMelt (DMelt) project, and will be appreciated by many data-analysis scientists, engineers and students.
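As a flavour of the kind of short snippet the book favours, here is a minimal descriptive-statistics example; it uses only the Python standard library (so it also runs under Jython) rather than the DataMelt classes the book itself works with, and the sample data are made up for illustration.

```python
import math

def describe(sample):
    """Basic descriptive statistics: count, mean, sample variance, standard deviation."""
    n = len(sample)
    mean = sum(sample) / float(n)
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    return {"n": n, "mean": mean, "variance": var, "stdev": math.sqrt(var)}

print(describe([1.2, 2.4, 3.1, 4.8, 5.0]))
```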
Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation is devoted to a new paradigm for evolutionary computation, named estimation of distribution algorithms (EDAs). This new class of algorithms generalizes genetic algorithms by replacing the crossover and mutation operators with learning and sampling from the probability distribution of the best individuals of the population at each iteration of the algorithm. Working in such a way, the relationships between the variables involved in the problem domain are explicitly and effectively captured and exploited. This text constitutes the first compilation and review of the techniques and applications of this new tool for performing evolutionary computation. Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation is clearly divided into three parts. Part I is dedicated to the foundations of EDAs. In this part, after introducing some probabilistic graphical models - Bayesian and Gaussian networks - a review of existing EDA approaches is presented, as well as some new methods based on more flexible probabilistic graphical models. A mathematical modeling of discrete EDAs is also presented. Part II covers several applications of EDAs in some classical optimization problems: the travelling salesman problem, the job scheduling problem, and the knapsack problem. EDAs are also applied to the optimization of some well-known combinatorial and continuous functions. Part III presents the application of EDAs to solve some problems that arise in the machine learning field: feature subset selection, feature weighting in K-NN classifiers, rule induction, partial abductive inference in Bayesian networks, partitional clustering, and the search for optimal weights in artificial neural networks. Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation is a useful and interesting tool for researchers working in the field of evolutionary computation and for engineers who face real-world optimization problems. This book may also be used by graduate students and researchers in computer science. '... I urge those who are interested in EDAs to study this well-crafted book today.' - David E. Goldberg, University of Illinois at Urbana-Champaign.
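For readers new to the paradigm, the following is a minimal univariate EDA sketch (UMDA-style) on a toy bit-string problem; the problem, population sizes, and selection scheme are illustrative assumptions, and the book's own models (Bayesian and Gaussian networks) are considerably richer than the independent per-bit distribution used here.

```python
import random

def umda_onemax(n_bits=20, pop_size=100, n_select=50, generations=30):
    """Univariate EDA: estimate per-bit probabilities from the selected
    individuals, then sample the next population from that distribution."""
    probs = [0.5] * n_bits
    for _ in range(generations):
        pop = [[1 if random.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        pop.sort(key=sum, reverse=True)          # fitness = number of ones
        best = pop[:n_select]
        probs = [sum(ind[i] for ind in best) / float(n_select)
                 for i in range(n_bits)]
    return probs

print(umda_onemax())  # probabilities drift towards 1.0 on this toy problem
```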
Graph algorithms are a well-established subject in mathematics and computer science. Beyond classical application fields, such as approximation, combinatorial optimization, graphics, and operations research, graph algorithms have recently attracted increased attention from computational molecular biology and computational chemistry. Centered around the fundamental issue of graph isomorphism, this text goes beyond classical graph problems of shortest paths, spanning trees, flows in networks, and matchings in bipartite graphs. Advanced algorithmic results and techniques of practical relevance are presented in a coherent and consolidated way. This book introduces graph algorithms on an intuitive basis followed by a detailed exposition in a literate programming style, with correctness proofs as well as worst-case analyses. Furthermore, full C++ implementations of all algorithms presented are given using the LEDA library of efficient data structures and algorithms.
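As a small taste of the classical material the book builds on, here is a shortest-path sketch. The book's own implementations are in C++ on top of LEDA, so the Python version below is only an illustrative stand-in with a made-up example graph.

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths on a weighted digraph given as
    {node: [(neighbour, weight), ...]} with non-negative weights."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

print(dijkstra({"a": [("b", 2), ("c", 5)], "b": [("c", 1)]}, "a"))
```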
Mathematical Summary for Digital Signal Processing Applications with Matlab covers mathematics that is not usually treated in the core DSP curriculum but is used in DSP applications. Matlab programs with illustrations are given in a separate chapter for selected topics such as the generation of multivariate Gaussian distributed sample outcomes, the bacterial foraging algorithm, Newton's iteration, and the steepest descent algorithm. The book is written to be suitable for non-mathematical readers and is very much suitable for beginners doing research in digital signal processing.
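The book's programs are in Matlab; purely as an illustration of one of the listed topics, here is a minimal Newton's iteration in Python. The example function, starting point, and tolerance are assumptions, not taken from the book.

```python
def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Newton's iteration for a root of f: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: the square root of 2 as the positive root of x**2 - 2.
print(newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0))
```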
As the conversion of legacy systems continues, the ability to understand embedded business rules becomes more and more critical. This ability is directly related to the structure of the programs within those systems. We also see the need to teach structured programming to a new generation of programmers who must maintain the billions of lines of existing COBOL code. The ultimate purpose of this text is to discuss how to judge the level of structure of a program. We do this by defining structured programming and then discussing how a structured program can be built through the application of the concepts of coupling and cohesion. We also show how the embedded business rules of the program can be separated from the data and presentation functions. The reader will be able to use these skills to judge and to improve the structure of a new program or an existing program.
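The book's context is COBOL, but the separation it advocates can be sketched in any language; the following hypothetical Python fragment keeps a business rule free of data access and presentation concerns, so it stays low in coupling and high in cohesion.

```python
# Business rule: isolated, with no I/O or formatting concerns.
def bulk_discount(quantity, unit_price, threshold=100, rate=0.05):
    """Return the discount owed; the rule knows nothing about storage or display."""
    return quantity * unit_price * rate if quantity >= threshold else 0.0

# Presentation lives in a separate routine that merely calls the rule.
def format_invoice_line(order):
    discount = bulk_discount(order["qty"], order["price"])
    total = order["qty"] * order["price"] - discount
    return "qty=%d total=%.2f discount=%.2f" % (order["qty"], total, discount)

print(format_invoice_line({"qty": 120, "price": 3.50}))
```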
This book explores coordination within and between teams in the context of large-scale agile software development, providing readers a deeper understanding of how coordinated action between teams is achieved in multiteam systems. An exploratory multiple case study with five multiteam systems and a total of 66 interviewees from development teams at SAP SE is presented and analyzed. In addition, the book explores stereotypes of coordination in large-scale agile settings and shares new perspectives on integrating conditions for coordination. No previous study has researched this topic with a similar data set, consisting of insights from professional software development teams. As such, the book will be of interest to all researchers and practitioners whose work involves software product development across several teams.
3D rotation analysis is widely encountered in everyday problems thanks to the development of computers. Sensing 3D using cameras and sensors, analyzing and modeling 3D for computer vision and computer graphics, and controlling and simulating robot motion all require 3D rotation computation. This book focuses on the computational analysis of 3D rotation, rather than classical motion analysis. It regards noise as random variables and models their probability distributions. It also pursues statistically optimal computation for maximizing the expected accuracy, as is typical of nonlinear optimization. All concepts are illustrated using computer vision applications as examples. Mathematically, the set of all 3D rotations forms a group denoted by SO(3). Exploiting this group property, we obtain an optimal solution analytically or numerically, depending on the problem. Our numerical scheme, which we call the "Lie algebra method," is based on the Lie group structure of SO(3). This book also proposes computing projects for readers who want to code the theories presented in it, describing the necessary 3D simulation settings as well as providing real GPS 3D measurement data. To help readers not very familiar with abstract mathematics, a brief overview of quaternion algebra, matrix analysis, Lie groups, and Lie algebras is provided as an appendix at the end of the volume.
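To illustrate the group structure the book exploits, here is a minimal sketch of the exponential map from the Lie algebra so(3) to the rotation group SO(3) via Rodrigues' formula; it is a generic illustration, not the book's Lie algebra method itself.

```python
import numpy as np

def exp_so3(omega):
    """Map a rotation vector (axis * angle) in so(3) to a rotation matrix in SO(3)."""
    theta = np.linalg.norm(omega)
    if theta < 1e-12:
        return np.eye(3)
    k = omega / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])          # skew-symmetric generator
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

R = exp_so3(np.array([0.0, 0.0, np.pi / 2]))  # 90-degree rotation about the z-axis
print(np.round(R, 6))
print(np.allclose(R @ R.T, np.eye(3)))        # orthogonality check: True
```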
This edited book presents scientific results of the 14th ACIS/IEEE International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2013), held in Honolulu, Hawaii, USA on July 1-3, 2013. The aim of this conference was to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas and research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected 17 outstanding papers from those accepted for presentation at the conference.
Algorithms and Programming is primarily intended for use in a first-year undergraduate course in programming. It is structured in a problem-solution format that requires the student to think through the programming process, thus developing an understanding of the underlying theory. The book is easily readable by a student taking a basic introductory course in computer science as well as useful for a graduate-level course in the analysis of algorithms and/or compiler construction. Each self-contained chapter presents classical and well-known problems supplemented by clear and in-depth explanations. The material covered includes such topics as combinatorics, sorting, searching, queues, grammar and parsing, selected well-known algorithms and much more. Students and teachers will find this both an excellent text for learning programming and a source of problems for a variety of courses.
Hardware description languages (HDLs) such as VHDL and Verilog have found their way into almost every aspect of the design of digital hardware systems. Since their inception they have gradually proved to be an essential part of modern design methodologies and design automation tools, ever exceeding their original goals of being description and simulation languages; their use for automatic synthesis, formal proof, and testing are good examples. So far, HDLs have mainly dealt with digital systems. However, integrated systems designed today require more and more analog parts such as A/D and D/A converters, phase-locked loops, current mirrors, etc. The verification of the complete system therefore calls for the use of a single language. Using VHDL or Verilog to handle analog descriptions is possible, as shown in this book, but the real power comes from true mixed-signal HDLs that integrate discrete and continuous semantics into a unified framework. Analog HDLs (AHDLs) are considered here as a subset of mixed-signal HDLs, as they intend to provide the same level of features as HDLs do but with a scope limited to analog systems, possibly with limited support of discrete semantics. Analog and Mixed-Signal Hardware Description Languages covers several aspects related to analog and mixed-signal hardware description languages, including: the use of a digital HDL for the description and simulation of analog systems; the emergence of extensions of existing standard HDLs that provide true analog and mixed-signal HDLs; the use of analog and mixed-signal HDLs for the development of behavioral models of analog (electronic) building blocks (operational amplifiers, PLLs) and for the design of microsystems that do not only involve electronic parts; and the use of a front-end tool that eases the description task with the help of a graphical paradigm, yet generates AHDL descriptions automatically. Analog and Mixed-Signal Hardware Description Languages is the first book to show how to use these new hardware description languages in the design of electronic components and systems. It is necessary reading for researchers and designers working in electronic design.
This book reviews the present understanding of the history of software and establishes an agenda for further research. By exploring this current understanding, the authors identify the fundamental elements of software. The problems and questions addressed in the book range from purely technical to societal issues. Thus, the articles presented offer a fresh view of this history with new categories and interrelated themes, comparing and contrasting software with artefacts in other disciplines, so as to ascertain in what ways software is similar to and different from other technologies. This volume is based on the international conference "Mapping the History of Computing: Software Issues", held in April 2000 at the Heinz Nixdorf Museums Forum in Paderborn, Germany.
The launch of Microsoft's Kinect, the first high-resolution depth-sensing camera for the consumer market, generated considerable excitement not only among computer gamers, but also within the global community of computer vision researchers. The potential of consumer depth cameras extends well beyond entertainment and gaming, to real-world commercial applications such as virtual fitting rooms, training for athletes, and assistance for the elderly. This authoritative text/reference reviews the scope and impact of this rapidly growing field, describing the most promising Kinect-based research activities, discussing significant current challenges, and showcasing exciting applications. Topics and features: presents contributions from an international selection of preeminent authorities in their fields, from both academic and corporate research; addresses the classic problem of multi-view geometry of how to correlate images from different viewpoints to simultaneously estimate camera poses and world points; examines human pose estimation using video-rate depth images for gaming, motion capture, 3D human body scans, and hand pose recognition for sign language parsing; provides a review of approaches to various recognition problems, including category and instance learning of objects, and human activity recognition; with a Foreword by Dr. Jamie Shotton of Microsoft Research, Cambridge, UK. This broad-ranging overview is a must-read for researchers and graduate students of computer vision and robotics wishing to learn more about the state of the art of this increasingly hot topic.
With the growth of public and private data stores and the emergence of off-the-shelf data-mining technology, recommendation systems have emerged that specifically address the unique challenges of navigating and interpreting software engineering data. This book collects, structures and formalizes knowledge on recommendation systems in software engineering. It adopts a pragmatic approach with an explicit focus on system design, implementation, and evaluation. The book is divided into three parts: "Part I - Techniques" introduces basics for building recommenders in software engineering, including techniques for collecting and processing software engineering data, but also for presenting recommendations to users as part of their workflow. "Part II - Evaluation" summarizes methods and experimental designs for evaluating recommendations in software engineering. "Part III - Applications" describes needs, issues and solution concepts involved in entire recommendation systems for specific software engineering tasks, focusing on the engineering insights required to make effective recommendations. The book is complemented by the webpage rsse.org/book, which includes free supplemental materials for readers of this book and anyone interested in recommendation systems in software engineering, including lecture slides, data sets, source code, and an overview of people, groups, papers and tools with regard to recommendation systems in software engineering. The book is particularly well-suited for graduate students and researchers building new recommendation systems for software engineering applications or in other high-tech fields. It may also serve as the basis for graduate courses on recommendation systems, applied data mining or software engineering. Software engineering practitioners developing recommendation systems or similar applications with predictive functionality will also benefit from the broad spectrum of topics covered.
It gives me immense pleasure to introduce this timely handbook to the research and development communities in the field of signal processing systems (SPS). This is the first of its kind and represents state-of-the-art coverage of research in this field. The driving force behind information technologies (IT) hinges critically upon the major advances in both component integration and system integration. The major breakthrough for the former is undoubtedly the invention of the IC in the 1950s by Jack S. Kilby, the Nobel Prize Laureate in Physics 2000. In an integrated circuit, all components were made of the same semiconductor material. Beginning with the pocket calculator in 1964, many increasingly complex applications have followed. In fact, processing gates and memory storage on a chip have since grown at an exponential rate, following Moore's Law. (Moore himself admitted that Moore's Law had turned out to be more accurate, longer lasting and deeper in impact than he ever imagined.) With greater device integration, various signal processing systems have been realized for many killer IT applications. Further breakthroughs in computer science and Internet technologies have also catalyzed large-scale system integration. All of this has led to today's IT revolution, which has profound impacts on our lifestyle and the overall prospects of humanity. (It is hard to imagine life today without mobile phones or the Internet.) The success of SPS requires a well-concerted, integrated approach from multiple disciplines, such as device, design, and application.
As miniaturisation deepens, and nanotechnology and its machines become more prevalent in the real world, the need to consider using quantum mechanical concepts to perform various tasks in computation increases. Such tasks include: the teleporting of information, breaking heretofore "unbreakable" codes, communicating with messages that betray eavesdropping, and the generation of random numbers. This is the first book to apply quantum physics to the basic operations of a computer, representing the ideal vehicle for explaining the complexities of quantum mechanics to students, researchers and computer engineers alike, as they prepare to design and create the computing and information delivery systems for the future. Both authors have solid backgrounds in the subject matter at the theoretical and more practical level. While serving as a text for senior/grad level students in computer science/physics/engineering, this book has its primary use as an up-to-date reference work in the emerging interdisciplinary field of quantum computing - the only prerequisite being knowledge of calculus and familiarity with the concept of the Turing machine.
* With this book readers might well be able to build the next Mars Rover.
* First book out on Java robotics.
* The biggest selling point of this book is that no one else shows readers how to combine the power of their PC with a robust programming language, Java, to create exciting robotics.
* The book is a great teaching aid (in robotics or software) that establishes a new paradigm for thinking about robotics, along with simpler ways to do things, i.e., vs. the old way of using microcontrollers.
This edited book is dedicated to Professor N. U. Ahmed, a leading scholar and a renowned researcher in optimal control and optimization, on the occasion of his retirement from the Department of Electrical Engineering at the University of Ottawa in 1999. The contributions of this volume are in the areas of optimal control, nonlinear optimization and optimization applications. They are mainly the improved and expanded versions of the papers selected from those presented in two special sessions of two international conferences. The first special session is Optimization Methods, which was organized by K. L. Teo and X. Q. Yang for the International Conference on Optimization and Variational Inequality, the City University of Hong Kong, Hong Kong, 1998. The other one is Optimal Control, which was organized by K. Teo and L. Caccetta for the Dynamic Control Congress, Ottawa, 1999. This volume is divided into three parts: Optimal Control; Optimization Methods; and Applications. The Optimal Control part is concerned with computational methods, modeling and nonlinear systems. Three computational methods for solving optimal control problems are presented: (i) a regularization method for computing ill-conditioned optimal control problems, (ii) penalty function methods that appropriately handle final state equality constraints, and (iii) a multilevel optimization approach for the numerical solution of optimal control problems. In the fourth paper, the worst-case optimal regulation involving linear time-varying systems is formulated as a minimax optimal control problem.
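To give a feel for the penalty-function idea mentioned above, here is a minimal sketch on a toy finite-dimensional problem with one equality constraint; the objective, constraint, penalty weights, and step sizes are all illustrative assumptions and are far simpler than the optimal control formulations treated in the book.

```python
def penalized_minimize(mu, steps=20000):
    """Minimize f(x, y) = (x - 1)^2 + (y - 2)^2 subject to x + y = 1 by adding
    a quadratic penalty mu * (x + y - 1)^2 and running plain gradient descent."""
    lr = 0.9 / (2.0 + 4.0 * mu)   # conservative step size for this quadratic
    x, y = 0.0, 0.0
    for _ in range(steps):
        c = x + y - 1.0
        gx = 2.0 * (x - 1.0) + 2.0 * mu * c
        gy = 2.0 * (y - 2.0) + 2.0 * mu * c
        x -= lr * gx
        y -= lr * gy
    return x, y

# As the penalty weight grows, the iterate approaches the constrained optimum (0, 1).
for mu in (1.0, 10.0, 100.0):
    x, y = penalized_minimize(mu)
    print(mu, round(x, 4), round(y, 4), "constraint residual:", round(x + y - 1.0, 4))
```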
In "Physical Unclonable Functions in Theory and Practice," the authorspresent an in-depth overview ofvarious topics concerning PUFs, providing theoretical background and application details. This book concentrates on the practical issues of PUF hardware design, focusing on dedicated microelectronic PUF circuits. Additionally, the authors discuss the whole process of circuit design, layout and chip verification. The book also offers coverage of: Different published approaches focusing on dedicated microelectronic PUF circuits Specification of PUF circuits General design issues Minimizing error rate from the circuit s perspective Transistor modeling issues of Montecarlo mismatch simulation and solutions Examples of PUF circuits including an accurate description of the circuits and testing/measurement resultsDifferent error rate reducing pre-selection techniques This monographgives insight into PUFs in general and provides knowledge in the field of PUF circuit design and implementation. It could be of interest for all circuit designers confronted with PUF design, and also for professionals and students being introduced to the topic."
This book provides an effective overview of the state of the art in software engineering, with a projection of the future of the discipline. It includes 13 papers, written by leading researchers in the respective fields, on important topics like model-driven software development, programming language design, microservices, software reliability, model checking and simulation. The papers are edited and extended versions of the presentations at the PAUSE symposium, which marked the completion of 14 years of work at the Chair of Software Engineering at ETH Zurich. In this inspiring context, some of the greatest minds in the field extensively discussed the past, present and future of software engineering. The book guides readers on a voyage of discovery through the discipline of software engineering today, offering unique food for thought for researchers and professionals, and inspiring future research and development.
Looking back at the years that have passed since the realization of the very first electronic, multi-purpose computers, one observes a tremendous growth in hardware and software performance. Today, researchers and engineers have access to computing power and software that can solve numerical problems which are not fully understood in terms of existing mathematical theory. Thus, computational sciences must in many respects be viewed as experimental disciplines. As a consequence, there is a demand for high-quality, flexible software that allows, and even encourages, experimentation with alternative numerical strategies and mathematical models. Extensibility is then a key issue; the software must provide an efficient environment for the incorporation of new methods and models that will be required in future problem scenarios. The development of such flexible software is a challenging and expensive task. One way to achieve these goals is to invest much work in the design and implementation of generic software tools which can be used in a wide range of application fields. In order to provide a forum where researchers could present and discuss their contributions to the described development, an International Workshop on Modern Software Tools for Scientific Computing was arranged in Oslo, Norway, September 16-18, 1996. This workshop, informally referred to as SciTools'96, was a collaboration between SINTEF Applied Mathematics and the Departments of Informatics and Mathematics at the University of Oslo.
What will be the next revolution in software technology to follow XML and generics? Whatever it may be, it is likely to come from functional programming, where many of the key ideas of the last decade originated. In this textbook, the leading researchers of the field take you on a tour of the current hot topics in functional programming, with applications ranging from financial contracts to circuit design. There are also chapters on new idioms for structuring programs, such as monads and arrows. All new concepts are illustrated with many examples, and exercises appear throughout to further enliven the material. Several of the chapters describe a substantial piece of software, and most of these programs are collected on a website for free downloading. This book was edited in honour of Richard S. Bird, Professor of Computer Science at the University of Oxford, on his 60th birthday.
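As a quick illustration of the "monad" idiom mentioned above, here is a hypothetical Maybe-style sketch in Python; the book's own examples are presumably written in a functional language rather than Python, so this is only a rough analogue of the idea.

```python
class Maybe:
    """A tiny Maybe-like type: chain computations that may fail without
    writing explicit None checks at every step."""
    def __init__(self, value, ok=True):
        self.value, self.ok = value, ok

    def bind(self, f):
        # Apply f only if the previous step succeeded; otherwise propagate the failure.
        return f(self.value) if self.ok else self

    @staticmethod
    def nothing():
        return Maybe(None, ok=False)

def safe_div(x, y):
    return Maybe(x / y) if y != 0 else Maybe.nothing()

result = Maybe(10.0).bind(lambda v: safe_div(v, 4)).bind(lambda v: safe_div(v, 0))
print(result.ok, result.value)  # False None: the division-by-zero failure propagates
```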
The 14 contributed chapters in this book survey the most recent developments in high-performance algorithms for NGS data, offering fundamental insights and technical information specifically on indexing, compression and storage; error correction; alignment; and assembly. The book will be of value to researchers, practitioners and students engaged with bioinformatics, computer science, mathematics, statistics and life sciences.