Welcome to Loot.co.za!
This book precisely formulates and simplifies the presentation of Instruction Level Parallelism (ILP) compilation techniques, offering uniquely consistent and uniform descriptions of the code transformations involved. Because ILP is ubiquitous in virtually every processor built today, from general-purpose CPUs to application-specific and embedded processors, the book is useful to students, practitioners, and researchers of advanced compilation techniques. With its emphasis on fine-grain instruction-level parallelism, it will also interest researchers and students of parallelism at large, inasmuch as the techniques described yield insights that go beyond compilation for superscalar and VLIW (Very Long Instruction Word) machines and apply to optimizing compilers in general. ILP techniques have also found wide and crucial application in design automation, where they have been used extensively to optimize performance and to minimize the area and power of computer designs.
This book covers all of the concepts required to tackle second-order cone programs (SOCPs), giving the reader a complete picture of SOC functions and their applications. SOCPs have attracted considerable attention due to their wide range of applications in engineering, data science, and finance. To deal with this special class of optimization problems involving second-order cones (SOCs), we most often need three crucial concepts: (i) the spectral decomposition associated with SOCs, (ii) the analysis of SOC functions, and (iii) SOC-convexity and SOC-monotonicity. The related algorithms can be roughly divided into two categories. One category comprises traditional algorithms that do not use complementarity functions; here, SOC-convexity and SOC-monotonicity play a key role. The other category employs complementarity functions, which are closely related to SOC functions, so the analysis of SOC functions helps in designing and analyzing these algorithms.
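For readers unfamiliar with the machinery, the spectral decomposition associated with SOCs can be made concrete. Any x = (x1, x_bar) in R x R^(n-1) decomposes as x = lambda_1*u_1 + lambda_2*u_2 with lambda_1 = x1 - ||x_bar|| and lambda_2 = x1 + ||x_bar||, and x lies in the cone exactly when lambda_1 >= 0. The following is a minimal sketch of this standard formula, not code from the book:

```python
import math

def soc_spectral_decomposition(x):
    """Spectral decomposition of x = (x1, x_bar) with respect to the
    second-order cone K^n = {(x1, x_bar) : ||x_bar||_2 <= x1}, n >= 2.

    Returns eigenvalues (l1, l2) and eigenvectors (u1, u2) such that
    x = l1*u1 + l2*u2.  x lies in K^n iff l1 >= 0.
    """
    x1, x_bar = x[0], x[1:]
    norm = math.sqrt(sum(v * v for v in x_bar))
    if norm > 0:
        w = [v / norm for v in x_bar]
    else:
        # when x_bar = 0 any unit vector works for w
        w = [1.0] + [0.0] * (len(x_bar) - 1)
    l1, l2 = x1 - norm, x1 + norm
    u1 = [0.5] + [-0.5 * v for v in w]
    u2 = [0.5] + [0.5 * v for v in w]
    return (l1, l2), (u1, u2)
```

Summing l1*u1 + l2*u2 recovers x component by component, which is easy to verify numerically.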
Computational Frameworks: Systems, Models and Applications provides an overview of advanced perspectives that bridges the gap between frontline research and practical efforts. It is unique in showing the interdisciplinary nature of this area and the way it interacts with emerging technologies and techniques. As computational systems are a dominant part of daily life and a necessary support for most of the engineering sciences, this book explores their usage (e.g. big data, high-performance clusters, databases and information systems, integrated and embedded hardware/software components, smart devices, mobile and pervasive networks, cyber-physical systems, etc.).
Now, for the first time, the landmark work in backpropagation appears in print. Scientists, engineers, statisticians, operations researchers, and other investigators involved in neural networks have long sought direct access to Paul Werbos's groundbreaking, much-cited 1974 Harvard doctoral thesis, which laid the foundation of backpropagation. With the publication of its full text in The Roots of Backpropagation, these practitioners can go straight to the original material and gain a deeper, practical understanding of this unique mathematical approach to social studies and related fields. In addition, Werbos has provided three more recent research papers inspired by his original work, along with a new guide to the field. Originally written for readers who lacked any knowledge of neural nets, The Roots of Backpropagation firmly establishes both its historical and continuing significance.
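For readers new to the subject, the core of backpropagation, applying the chain rule in reverse order of computation (the "ordered derivatives" of Werbos's thesis), can be sketched on a two-weight toy network. This is an illustrative example, not material from the book:

```python
import math

# y = sigmoid(w2 * sigmoid(w1 * x)); loss = (y - t)^2.
# The backward pass accumulates derivatives in reverse order of the
# forward computation -- the essence of backpropagation.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss_and_grads(w1, w2, x, t):
    # forward pass
    a = w1 * x
    h = sigmoid(a)
    b = w2 * h
    y = sigmoid(b)
    loss = (y - t) ** 2
    # backward pass (chain rule applied step by step, in reverse)
    dloss_dy = 2.0 * (y - t)
    dy_db = y * (1.0 - y)          # sigmoid'(b)
    dloss_db = dloss_dy * dy_db
    dloss_dw2 = dloss_db * h
    dloss_dh = dloss_db * w2
    dh_da = h * (1.0 - h)          # sigmoid'(a)
    dloss_dw1 = dloss_dh * dh_da * x
    return loss, dloss_dw1, dloss_dw2
```

A quick sanity check is to compare the analytic gradients against central finite differences of the loss.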
With the widespread knowledge and use of e-government, the intent and evaluation of e-government services continue to focus on meeting the needs of citizens and securing their satisfaction. E-Government Services Design, Adoption, and Evaluation is a comprehensive collection of research on the assessment and implementation of electronic/digital government technologies in organizations. The book aims to supply academics, practitioners, and professionals with an understanding of e-government and of its applications and impact on organizations around the world.
This book takes a foundational approach to the semantics of probabilistic programming. It elaborates a rigorous Markov chain semantics for the probabilistic typed lambda calculus, i.e., the typed lambda calculus with recursion plus probabilistic choice. The book starts with a recapitulation of the basic mathematical tools needed throughout, in particular Markov chains, graph theory, and domain theory, and also explores inductive definitions. It then defines the syntax and establishes the Markov chain semantics of the probabilistic lambda calculus, along with both a graph and a tree semantics. On this basis, it investigates the termination behavior of probabilistic programs: it introduces the notions of termination degree, bounded termination, and path stoppability, and studies their mutual relationships. Lastly, it defines a denotational semantics of the probabilistic lambda calculus, based on continuous functions over probability distributions as domains. The book will mainly appeal to researchers in theoretical computer science focusing on probabilistic programming, randomized algorithms, or programming language theory.
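The idea of a termination degree, the probability that a probabilistic program halts, can be illustrated with a hypothetical toy program (not an example taken from the book): with probability 1/3 it halts, otherwise it launches two fresh recursive copies of itself. Its termination degree is the least solution of s = 1/3 + (2/3)s^2, namely 1/2, which a Monte Carlo simulation can approximate:

```python
import random

def estimate_termination_degree(p_halt=1.0 / 3.0, trials=4000,
                                step_cap=1000, seed=1):
    """Monte Carlo estimate of the termination degree of a toy
    probabilistic program: with probability p_halt it halts, otherwise
    it runs two fresh recursive copies of itself.  The exact degree is
    the least root of s = p + (1 - p) * s**2; for p = 1/3 that is 1/2.
    Runs exceeding step_cap are counted as non-terminating, which
    introduces only a tiny downward bias."""
    rng = random.Random(seed)
    terminated = 0
    for _ in range(trials):
        pending = 1            # copies of the program still running
        steps = 0
        while pending and steps < step_cap:
            steps += 1
            pending -= 1       # resolve one pending copy
            if rng.random() >= p_halt:
                pending += 2   # it recursed into two copies
        if pending == 0:
            terminated += 1
    return terminated / trials
```

The estimate should land near 0.5, matching the analytic fixed-point solution.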
Ada's Legacy illustrates the depth and diversity of writers, thinkers, and makers who have been inspired by Ada Lovelace, the English mathematician and writer. The volume, which commemorates the bicentennial of Ada's birth in December 1815, celebrates Lovelace's many achievements as well as the impact of her life and work, which have reverberated widely since the late nineteenth century. In the 21st century we have seen a resurgence in Lovelace scholarship, thanks to the growth of interdisciplinary thinking and the expanding influence of women in science, technology, engineering, and mathematics. Ada's Legacy is a unique contribution to this scholarship, combining papers on Ada's collaboration with Charles Babbage, her position in the Victorian and steampunk literary genres, her representation in and inspiration of contemporary art and comics, and her continued relevance in discussions about gender and technology in the digital age. With the 200th anniversary of Ada Lovelace's birth falling on December 10, 2015, the timing is perfect to publish this collection of papers. Because of its broad focus on subjects that reach far beyond the life and work of Ada herself, Ada's Legacy will appeal to readers who are curious about her enduring importance in computing and the wider world.
In recent years, many educational systems, especially intelligent tutoring systems, have been implemented according to an agent paradigm, and researchers in education believe that educational computing environments would be more pedagogically effective if they had mechanisms to show and recognize students' emotions. "Agent-based Tutoring Systems by Cognitive and Affective Modeling" presents a modern view of intelligent tutoring, focusing mainly on the conception of these systems according to a multi-agent approach and on the affective and cognitive modeling of the student in this kind of educational environment. Providing researchers, academicians, educators, and practitioners with a critical mass of research on the theory, practice, development, and implementation of tools for knowledge representation and agent-based architectures, this Premier Reference Source is a must-have addition to every library collection.
The third edition of this handbook is designed to provide broad coverage of the concepts, implementations, and applications in metaheuristics. The book's chapters serve as stand-alone presentations giving both the necessary underpinnings and practical guides for implementation. The nature of metaheuristics invites an analyst to modify basic methods in response to problem characteristics, past experiences, and personal preferences, and the chapters in this handbook are designed to facilitate this process as well. This new edition has been fully revised and features new chapters on swarm intelligence and on the automated design of metaheuristics from flexible algorithm frameworks. The authors who have contributed to this volume are leading figures of the metaheuristics community, responsible for pioneering contributions to the fields they write about. Their collective work has significantly enriched the field of optimization in general and combinatorial optimization in particular.

Metaheuristics are solution methods that orchestrate an interaction between local improvement procedures and higher-level strategies to create a process capable of escaping from local optima and performing a robust search of a solution space. Many new and exciting developments and extensions have been observed in the last few years. Hybrids of metaheuristics with other optimization techniques, like branch-and-bound, mathematical programming, or constraint programming, are also increasingly popular. On the applications front, metaheuristics are now used to find high-quality solutions to an ever-growing number of complex, ill-defined real-world problems, in particular combinatorial ones. This handbook should continue to be a great reference for researchers, graduate students, and practitioners interested in metaheuristics.
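The orchestration of local improvement with a higher-level escape strategy can be sketched in a few lines. The sketch below is illustrative only (the function names and parameters are invented): it uses random restarts, the simplest such higher-level strategy, on top of greedy hill climbing:

```python
import random

def hill_climb(f, x0, step=0.01, max_iters=5000):
    """Greedy local improvement on a 1-D objective: move to a better
    neighbouring point until no neighbour improves (a local optimum)."""
    x = x0
    for _ in range(max_iters):
        best = min((x, x - step, x + step), key=f)
        if best == x:
            break  # stuck at a local optimum
        x = best
    return x

def restart_search(f, lo, hi, restarts=20, seed=0):
    """A minimal metaheuristic: random restarts act as the higher-level
    strategy that lets the search escape the local optima trapping
    plain hill climbing."""
    rng = random.Random(seed)
    candidates = [hill_climb(f, rng.uniform(lo, hi)) for _ in range(restarts)]
    return min(candidates, key=f)
```

On an objective such as f(x) = x^4 - 14x^2 + 24x, which has a local minimum near x = 2 and a global minimum near x = -3, a single hill climb often stalls at the local minimum, while the restart strategy reliably reaches the global one.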
This volume contains the proceedings of the Parallel Computational Fluid Dynamics 2002 international conference. Leading researchers in the field present up-to-date information about numerical simulations of flows using parallel computers. Special topics are "Grid Computing" and "Earth Simulator." Grid computing is currently among the most exciting topics in computer science, and an invited paper on it is included. The Earth Simulator is now the fastest computer in the world; papers on flow simulations using the Earth Simulator are also included, as well as a thirty-two-page tutorial article on numerical optimization.
This textbook provides concise coverage of the basics of linear and integer programming which, with megatrends toward optimization, machine learning, big data, etc., are becoming fundamental toolkits for data and information science and technology. The authors' approach is accessible to students from almost all fields of engineering, including operations research, statistics, machine learning, control system design, scheduling, formal verification, and computer vision. The presentation lays the basis for numerous approaches to solving hard combinatorial optimization problems through randomization and approximation. Readers will learn to cast various problems that may arise in their research as optimization problems, to recognize the cases where the optimization problem is linear, to choose appropriate solution methods, and to interpret the results appropriately.
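As a small illustration of the linear-programming side (an invented example, not taken from the textbook): a bounded, feasible LP attains its optimum at a vertex of the feasible region, so a two-variable problem can be solved by brute-force enumeration of constraint intersections. Real solvers use the simplex or interior-point methods instead; this sketch only demonstrates the vertex property:

```python
from itertools import combinations

def solve_lp_2d(c, constraints):
    """Minimize c[0]*x + c[1]*y subject to a*x + b*y <= d for each
    (a, b, d) in constraints, by enumerating vertices of the feasible
    region.  Assumes the region is bounded and non-empty."""
    eps = 1e-9
    best = None
    for (a1, b1, d1), (a2, b2, d2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < eps:
            continue  # parallel constraint lines: no intersection point
        # Cramer's rule for the 2x2 system of the two active constraints
        x = (d1 * b2 - d2 * b1) / det
        y = (a1 * d2 - a2 * d1) / det
        # keep the vertex only if it satisfies every constraint
        if all(a * x + b * y <= d + eps for a, b, d in constraints):
            val = c[0] * x + c[1] * y
            if best is None or val < best[0]:
                best = (val, x, y)
    return best  # (optimal value, x, y), or None if infeasible
```

For instance, minimizing -x - y subject to x + 2y <= 4, 3x + y <= 6, x >= 0, y >= 0 (the sign constraints written as -x <= 0 and -y <= 0) yields the vertex (1.6, 1.2) with value -2.8.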
It has been over twenty years since the foundations of actor-network theory were first set down in print. Since then, the Information and Communication Technologies (ICT) community has begun to discover the power of actor-network theory as an explanatory framework for much of its research. This research community has come to understand that information systems are, of necessity, socio-technical in nature and require a socio-technical approach to their investigation. Thanks to developments in actor-network theory, researchers can now treat people and technology as a single entity, which supports the study of social influences on technological innovation. Social Influences on Information and Communication Technology Innovations discusses in detail the use of actor-network theory to explain socio-technical phenomena, focusing on information and communication technologies. The implementation and use of such technologies inevitably involve interactions between technology and people. This publication facilitates international growth in the body of research investigating the value of actor-network theory as a means of understanding socio-technical phenomena and technological innovation.
This book provides a comprehensive account of the glowworm swarm optimization (GSO) algorithm, including details of the underlying ideas, theoretical foundations, algorithm development, various applications, and MATLAB programs for the basic GSO algorithm. It also discusses several research problems, at different levels of sophistication, that can be attempted by interested researchers. The generality of the GSO algorithm is evident in its application to diverse problems ranging from optimization to robotics. Examples include computation of multiple optima, annual crop planning, cooperative exploration, distributed search, multiple source localization, contaminant boundary mapping, wireless sensor networks, clustering, knapsack problems, numerical integration, solving fixed-point equations, solving systems of nonlinear equations, and engineering design optimization. The book is a valuable resource for researchers, as well as graduate and undergraduate students, working in the areas of swarm intelligence and computational intelligence.
The advancement of information technology is becoming more prevalent in all aspects of the world today, including online environments. Understanding technology's effect on niche markets and all fields of research is crucial for practitioners in this area. Contemporary Advancements in Information Technology Development in Dynamic Environments presents an in-depth discussion of the information technology revolution under way in fields such as government, gaming, social networking, and cloud computing. The book's investigation into the research and application of information technology in several specific areas makes it a useful resource for practitioners, professionals, undergraduate/graduate students, and academics.
This volume contains the proceedings of two conferences held in Toronto (Canada) and Kozhikode (India) in 2016 in honor of the 60th birthday of Professor Kumar Murty. The meetings focused on several aspects of number theory:
- The theory of automorphic forms and their associated L-functions
- Arithmetic geometry, with special emphasis on algebraic cycles, Shimura varieties, and explicit methods in the theory of abelian varieties
- The emerging applications of number theory in information technology
Kumar Murty has been a substantial influence on these topics, and the two conferences were aimed at honoring his many contributions to number theory, arithmetic geometry, and information technology.
This book puts in one place, and in accessible form, Richard Berk's most recent work on forecasting re-offending by individuals already in criminal justice custody. Using machine learning statistical procedures trained on very large datasets, an explicit treatment of the relative costs of forecasting errors as the forecasts are constructed, and an emphasis on maximizing forecasting accuracy, the author shows how his decades of research on the topic improve forecasts of risk. Criminal justice risk forecasts anticipate the future behavior of specified individuals, rather than engaging in "predictive policing" for locations in time and space, which is a very different enterprise that uses different data and different data-analysis tools. The audience for this book includes graduate students and researchers in the social sciences, and data analysts in criminal justice agencies. Formal mathematics is used only as necessary, or in concert with more intuitive explanations.
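The role of relative error costs can be illustrated generically. The sketch below is the textbook cost-sensitive thresholding rule, not Berk's actual procedure: if a false negative (missing a re-offender) is treated as R times costlier than a false positive, expected cost is minimized by flagging a case when its predicted risk exceeds 1/(1 + R):

```python
def flag_high_risk(probs, cost_fn_over_fp=5.0):
    """Cost-sensitive thresholding of predicted risk probabilities.

    Flagging a truly negative case costs c_FP; not flagging a truly
    positive case costs c_FN.  For predicted risk p, flagging is cheaper
    in expectation when c_FP * (1 - p) < c_FN * p, i.e. when
    p > c_FP / (c_FP + c_FN) = 1 / (1 + R), with R = c_FN / c_FP."""
    threshold = 1.0 / (1.0 + cost_fn_over_fp)
    return [p > threshold for p in probs]
```

With R = 5 the threshold drops to about 0.167, so far more cases are flagged than under the symmetric-cost threshold of 0.5, which is exactly the asymmetry a policy maker may intend.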
This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.