This book covers the optimization of engineering and management problems using soft computing techniques, with an industrial outlook. It addresses a broad range of real-life complex decision-making problems using heuristic approaches, and explores wide perspectives and future directions in industrial engineering research on a global scale. The book highlights the concept of optimization, presents various soft computing techniques, offers sample problems, and discusses related software programs complete with illustrations. Features: explains the concept of optimization and the relevance of soft computing techniques to optimal solutions in engineering and management; presents various soft computing techniques; offers problems and their optimization using various soft computing techniques; discusses related software programs, with illustrations; and provides a step-by-step tutorial on how to handle the relevant software for obtaining optimal solutions to various engineering problems.
Originally published in 1995, Large Deviations for Performance Analysis consists of two synergistic parts. The first half develops the theory of large deviations from the beginning, through recent results on the theory for processes with boundaries, keeping to a very narrow path: continuous-time, discrete-state processes. By developing only what is needed for the applications, the theory is kept to a manageable level, both in terms of length and in terms of difficulty. Within its scope, the treatment is detailed, comprehensive and self-contained. As the book shows, there are sufficiently many interesting applications of jump Markov processes to warrant a special treatment. The second half is a collection of applications developed at Bell Laboratories. The applications cover large areas of the theory of communication networks: circuit switched transmission, packet transmission, multiple access channels, and the M/M/1 queue. Aspects of parallel computation are covered as well, including basics of job allocation, rollback-based parallel simulation, assorted priority queueing models that might be used in performance models of various computer architectures, and asymptotic coupling of processors. These applications are thoroughly analysed using the tools developed in the first half of the book.
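The exponential tail behaviour that large deviations theory makes rigorous can be glimpsed in the M/M/1 queue the blurb mentions. The following is an illustrative pure-Python calculation (my own sketch, not the book's): in a stable M/M/1 queue with arrival rate lam < service rate mu, the stationary probability that the queue holds at least n customers is rho**n with rho = lam/mu, so log P(Q >= n) decays linearly in n at rate log(mu/lam).

```python
import math

def tail_prob(lam, mu, n):
    """P(queue length >= n) in a stationary M/M/1 queue."""
    rho = lam / mu
    assert rho < 1, "queue must be stable (lam < mu)"
    return rho ** n

def decay_rate(lam, mu):
    """Large-deviations decay rate: -lim (1/n) * log P(Q >= n)."""
    return math.log(mu / lam)

# With lam = 1, mu = 2 (rho = 0.5), the tail halves with each extra customer,
# and the decay rate is log 2.
```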
Designed for a proof-based course on linear algebra, this rigorous and concise textbook intentionally introduces vector spaces, inner products, and vector and matrix norms before Gaussian elimination and eigenvalues so students can quickly discover the singular value decomposition (SVD), arguably the most enlightening and useful of all matrix factorizations. Gaussian elimination is then introduced after the SVD and the four fundamental subspaces and is presented in the context of vector spaces rather than as a computational recipe. This allows the authors to use linear independence, spanning sets and bases, and the four fundamental subspaces to explain and exploit Gaussian elimination and the LU factorization, as well as the solution of overdetermined linear systems in the least squares sense and eigenvalues and eigenvectors. This unique textbook also includes examples and problems focused on concepts rather than the mechanics of linear algebra. The problems at the end of each chapter and on an associated website encourage readers to explore how to use the notions introduced in the chapter in a variety of ways. Additional problems, quizzes, and exams will be posted on an accompanying website and updated regularly. The Less Is More Linear Algebra of Vector Spaces and Matrices is for students and researchers interested in learning linear algebra who have the mathematical maturity to appreciate abstract concepts that generalize intuitive ideas. The early introduction of the SVD makes the book particularly useful for those interested in using linear algebra in applications such as scientific computing and data science. It is appropriate for a first proof-based course in linear algebra.
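To give a concrete taste of the SVD the blurb centres on, here is a hedged pure-Python sketch (my own illustration, not the book's code) of how the largest singular value of a matrix can be found by power iteration on A^T A; the example matrix and iteration count are arbitrary choices for the demonstration.

```python
def matvec(M, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def transpose(M):
    return [list(row) for row in zip(*M)]

def largest_singular_value(A, iters=50):
    """Estimate sigma_max(A) by power iteration on A^T A."""
    At = transpose(A)
    v = [1.0] * len(A[0])
    for _ in range(iters):
        w = matvec(At, matvec(A, v))          # one power-iteration step
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]             # renormalize
    # v now approximates the top right singular vector; ||A v|| = sigma_max
    Av = matvec(A, v)
    return sum(x * x for x in Av) ** 0.5

# For the symmetric matrix [[2, 1], [1, 2]] (eigenvalues 3 and 1),
# the largest singular value is 3.
```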
This book focuses on lattice-based cryptosystems, widely considered to be among the most promising post-quantum cryptosystems, and provides fundamental insights into how to construct provably secure cryptosystems from hard lattice problems. The concept of provable security is used to inform the choice of lattice tool for designing cryptosystems, including public-key encryption, identity-based encryption, attribute-based encryption, key exchange, and digital signatures. Given its depth of coverage, the book especially appeals to graduate students and young researchers who plan to enter this research area.
Cryptographic applications such as the RSA algorithm, ElGamal cryptography, elliptic curve cryptography, the Rabin cryptosystem, the Diffie-Hellman key exchange, and the Digital Signature Standard use modular exponentiation extensively. The performance of all these applications depends strongly on the efficient implementation of modular exponentiation and modular multiplication. Since 1984, when Montgomery first introduced a method for evaluating modular multiplications, many algorithmic modifications have improved the efficiency of modular multiplication, but far less work has been done on improving the efficiency of modular exponentiation itself. This research monograph addresses the question: how can the performance of modular exponentiation, the crucial operation of many public-key cryptographic techniques, be improved? The book focuses on energy-efficient modular exponentiation for cryptographic hardware. Spread across five chapters, this well-researched text examines bit-forwarding techniques and the corresponding hardware realizations in detail. Readers will also discover advanced performance-improvement techniques based on high-radix multiplication and cryptographic hardware based on multi-core architectures.
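As a point of reference for the operation this monograph optimizes, here is a minimal square-and-multiply sketch of binary modular exponentiation, the baseline that Montgomery multiplication and hardware bit-forwarding techniques accelerate. This is an illustrative sketch of the standard algorithm, not code from the book; the test values are my own.

```python
def mod_exp(base, exponent, modulus):
    """Compute base**exponent % modulus, scanning exponent bits LSB-first."""
    result = 1
    base %= modulus
    while exponent > 0:
        if exponent & 1:                    # multiply step for each set bit
            result = (result * base) % modulus
        base = (base * base) % modulus      # square step for every bit
        exponent >>= 1
    return result

# Only O(log exponent) modular multiplications are needed, which is why the
# cost of each multiplication dominates RSA, ElGamal, and DSS performance.
```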
This book combines the advantages of high-dimensional data visualization and machine learning in the context of identifying complex n-D data patterns. It vastly expands the class of reversible lossless 2-D and 3-D visualization methods, which preserve the n-D information. This class of visual representations, called the General Lines Coordinates (GLCs), is accompanied by a set of algorithms for n-D data classification, clustering, dimension reduction, and Pareto optimization. The mathematical and theoretical analyses and methodology of GLC are included, and the usefulness of this new approach is demonstrated in multiple case studies. These include the Challenger disaster, world hunger data, health monitoring, image processing, text classification, market forecasts for a currency exchange rate, computer-aided medical diagnostics, and others. As such, the book offers a unique resource for students, researchers, and practitioners in the emerging field of Data Science.
While computer security is a broader term that incorporates technologies, protocols, standards, and policies to ensure the security of computing systems, including computer hardware, software, and the information stored in them, cyber security is a specific, growing field concerned with protecting computer networks (offline and online) from unauthorized access, botnets, phishing scams, and the like. Machine learning is a branch of computer science that enables computing machines to adopt new behaviors on the basis of observable and verifiable data and information. It can be applied to ensure the security of computers and information by detecting anomalies using data mining and other such techniques. This book will be an invaluable resource for understanding the importance of machine learning and data mining in establishing computer and cyber security. It emphasizes important security aspects associated with computer and cyber security, along with the analysis of machine learning and data mining based solutions. The book also highlights the future research domains in which these solutions can be applied. Furthermore, it caters to the needs of IT professionals, researchers, faculty members, scientists, graduate students, research scholars, and software developers who seek to carry out research and develop combating solutions in the area of cyber security using machine learning based approaches. It is an extensive source of information for readers in the field of computer science and engineering, and for cyber security professionals. Key features: the book contains examples and illustrations to demonstrate the principles, algorithms, challenges, and applications of machine learning and data mining for computer and cyber security; it showcases important security aspects and current trends in the field; and it provides insight into future research directions in the field.
The contents of this book help to prepare students to mount a better defense, by understanding the motivations of attackers and how to deal with and mitigate attacks using machine learning based approaches.
Realism and Complexity in Social Science is an argument for a new approach to investigating the social world, that of complex realism. Complex realism brings together a number of strands of thought, in scientific realism, complexity science, probability theory and social research methodology. It proposes that the reality of the social world is that it is probabilistic, yet there exists enough invariance to make the discovery and explanation of social objects and causal mechanisms possible. This forms the basis for the development of a complex realist foundation for social research, that utilises a number of new and novel approaches to investigation, alongside the more traditional corpus of quantitative and qualitative methods. Research examples are drawn from research in sociology, epidemiology, criminology, social policy and human geography. The book assumes no prior knowledge of realism, probability or complexity and in the early chapters, the reader is introduced to these concepts and the arguments against them. Although the book is grounded in philosophical reasoning, this is in a direct and accessible style that will appeal both to social researchers with a methodological interest and philosophers with an interest in social investigation.
We live in an algorithmic society. Algorithms have become the main mediator through which power is enacted in our society. This book brings together three academic fields - Public Administration, Criminal Justice and Urban Governance - into a single conceptual framework, and offers a broad cultural-political analysis, addressing critical and ethical issues of algorithms. Governments are increasingly turning towards algorithms to predict criminality, deliver public services, allocate resources, and calculate recidivism rates. Mind-boggling amounts of data regarding our daily actions are analysed to make decisions that manage, control, and nudge our behaviour in everyday life. The contributions in this book offer a broad analysis of the mechanisms and social implications of algorithmic governance. Reporting from the cutting edge of scientific research, the result is illuminating and useful for understanding the relations between algorithms and power. Topics covered include: algorithmic governmentality; transparency and accountability; fairness in criminal justice and predictive policing; principles of good digital administration; and artificial intelligence (AI) in the smart city. This book is essential reading for students and scholars of Sociology, Criminology, Public Administration, Political Sciences, and Cultural Theory interested in the integration of algorithms into the governance of society.
Discrete Mathematics for Computer Science: An Example-Based Introduction is intended for a first- or second-year discrete mathematics course for computer science majors. It covers many important mathematical topics essential for future computer science majors, such as algorithms, number representations, logic, set theory, Boolean algebra, functions, combinatorics, algorithmic complexity, graphs, and trees. Features Designed to be especially useful for courses at the community-college level Ideal as a first- or second-year textbook for computer science majors, or as a general introduction to discrete mathematics Written to be accessible to those with a limited mathematics background, and to aid with the transition to abstract thinking Filled with over 200 worked examples, boxed for easy reference, and over 200 practice problems with answers Contains approximately 40 simple algorithms to aid students in becoming proficient with algorithm control structures and pseudocode Includes an appendix on basic circuit design which provides a real-world motivational example for computer science majors by drawing on multiple topics covered in the book to design a circuit that adds two eight-digit binary numbers Jon Pierre Fortney graduated from the University of Pennsylvania in 1996 with a BA in Mathematics and Actuarial Science and a BSE in Chemical Engineering. Prior to returning to graduate school, he worked as both an environmental engineer and as an actuarial analyst. He graduated from Arizona State University in 2008 with a PhD in Mathematics, specializing in Geometric Mechanics. Since 2012, he has worked at Zayed University in Dubai. This is his second mathematics textbook.
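The appendix example the blurb describes, a circuit that adds two eight-digit binary numbers, can be sketched in software as a chain of full adders built from AND, OR, and XOR gates. The following is one plausible rendering of that idea in Python (my own illustration, not the book's circuit); bit lists are least-significant-bit first.

```python
def full_adder(a, b, cin):
    """One full adder: sum bit and carry-out from two bits and a carry-in."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_carry_add(x_bits, y_bits):
    """Add two equal-length bit lists (LSB first); result has one extra bit."""
    carry = 0
    out = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)   # carry ripples to the next stage
        out.append(s)
    return out + [carry]

def to_bits(n, width=8):
    return [(n >> i) & 1 for i in range(width)]

def from_bits(bits):
    return sum(b << i for i, b in enumerate(bits))
```

The ninth output bit is the final carry, so two 8-bit inputs can produce sums up to 511 without overflow.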
The introduction of public key cryptography (PKC) was a critical advance in IT security. In contrast to symmetric key cryptography, it enables confidential communication between entities in open networks, in particular the Internet, without prior contact. Beyond this, PKC also enables protection techniques that have no analogue in traditional cryptography, most importantly digital signatures, which, for example, support Internet security by authenticating software downloads and updates. Although PKC does not require the confidential exchange of secret keys, proper management of the private and public keys used in PKC is still of vital importance: the private keys must remain private, and the public keys must be verifiably authentic. Understanding the public key infrastructures (PKIs) that manage key pairs is therefore at least as important as studying the ingenious mathematical ideas underlying PKC. In this book the authors explain the most important concepts underlying PKIs and discuss relevant standards, implementations, and applications. The book is structured into chapters on the motivation for PKI, certificates, trust models, private keys, revocation, validity models, certification service providers, certificate policies, certification paths, and practical aspects of PKI. This is a suitable textbook for advanced undergraduate and graduate courses in computer science, mathematics, engineering, and related disciplines, complementing introductory courses on cryptography. The authors assume only basic computer science prerequisites, and they include exercises in all chapters and solutions in an appendix. They also include detailed pointers to relevant standards and implementation guidelines, so the book is also appropriate for self-study and reference by industrial and academic researchers and practitioners.
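The sign/verify asymmetry behind the digital signatures discussed above can be shown with a deliberately tiny, insecure RSA sketch. This is my own toy illustration, not the book's material: the primes are textbook-sized toy values, the "digest" is just an integer, and real PKI uses far larger keys plus padding and certificate chains.

```python
# Toy RSA keypair (classic example values; wildly insecure on purpose).
p, q = 61, 53
n = p * q                             # public modulus: 3233
e = 17                                # public exponent
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent (Python 3.8+)

def sign(digest):
    """Signing uses the private key: sig = digest^d mod n."""
    return pow(digest, d, n)

def verify(digest, signature):
    """Verification uses only public (e, n): sig^e mod n must equal digest."""
    return pow(signature, e, n) == digest
```

Because verification needs only the public key, anyone can check a signature; the PKI's job, as the book explains, is making sure that the public key itself is authentic.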
Anyone Can Code: The Art and Science of Logical Creativity introduces computer programming as a way of problem-solving through logical thinking. It uses the notion of modularization as a central lens through which we can make sense of many software concepts. This book takes the reader through fundamental concepts in programming by illustrating them in three different and distinct languages: C/C++, Python, and JavaScript. Key features: focuses on problem-solving and algorithmic thinking instead of programming functions, syntax, and libraries; includes engaging examples, including video games and visual effects; provides exercises and reflective questions. This book gives beginner and intermediate learners a strong understanding of what they are doing so that they can do it better and with any other tool or language that they may end up using later.
This book discusses an important area of numerical optimization: the interior-point method. The topic has been popular since the 1980s, when it gradually became clear that known simplex algorithms were not guaranteed to converge in polynomial time, while many interior-point algorithms could be proved to converge in polynomial time. However, for a long time there was a noticeable gap between the theoretical polynomial bounds of interior-point algorithms and their practical efficiency. Strategies that were important to computational efficiency became barriers in the proofs of good polynomial bounds: the more such strategies an algorithm used, the worse its provable polynomial bound became. To exacerbate the problem further, Mehrotra's predictor-corrector (MPC) algorithm (until recently the most popular and efficient interior-point algorithm) uses all the good strategies, yet its convergence cannot be proved. Therefore, MPC lacks a polynomiality guarantee, the same critical issue that afflicts the simplex method. This book discusses recent developments that resolve this dilemma. It has three major parts. The first, comprising Chapters 1 to 4, presents some of the most important algorithms developed during the rise of the interior-point method around the 1990s, most of them widely known. The main purpose of this part is to explain the dilemma described above by analyzing these algorithms' polynomial bounds and summarizing the computational experience associated with them. The second part, comprising Chapters 5 to 8, describes how to resolve the dilemma step by step using arc-search techniques; at the end of this part, a very efficient algorithm with the lowest polynomial bound is presented. The last part, comprising Chapters 9 to 12, extends arc-search techniques to more general problems, such as convex quadratic programming, the linear complementarity problem, and semi-definite programming.
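The central path that interior-point methods follow can be made concrete with a one-dimensional toy problem (my own illustration, not from the book): minimize c*x subject to 0 <= x <= 1 with c > 0. The log-barrier objective c*x - mu*(log(x) + log(1 - x)) has a closed-form minimizer, and as the barrier weight mu shrinks, that minimizer traces the central path toward the true optimum x = 0.

```python
import math

def central_path_point(c, mu):
    """Minimizer of c*x - mu*(log(x) + log(1-x)) on (0, 1).

    Stationarity gives c - mu/x + mu/(1-x) = 0, i.e. the quadratic
    c*x^2 - (c + 2*mu)*x + mu = 0; the smaller root lies in (0, 1).
    """
    b = c + 2 * mu
    return (b - math.sqrt(b * b - 4 * c * mu)) / (2 * c)

# Shrinking mu moves the barrier minimizer along the central path toward 0;
# interior-point algorithms differ mainly in how they follow this path.
```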
Justice apps - mobile and web-based programmes that can assist individuals with legal tasks - are being produced, improved, and accessed at an unprecedented rate. These technologies have the potential to reshape the justice system, improve access to justice, and demystify legal institutions. Using artificial intelligence techniques, apps can even facilitate the resolution of common legal disputes. However, these opportunities must be assessed in light of the many challenges associated with app use in the justice sector. These include the digital divide and other accessibility issues; the ethical challenges raised by the dehumanisation of legal processes; and various privacy, security, and confidentiality risks. Surveying the landscape of this emergent industry, this book explores the objectives, opportunities, and challenges presented by apps across all areas of the justice sector. Detailed consideration is also given to the use of justice apps in specific legal contexts, including the family law and criminal law sectors. The first book to engage with justice apps, this book will appeal to a wide range of legal scholars, students, practitioners, and policy-makers.
This compendium provides a detailed account of the lognormality principle characterizing human motor behavior, summarizing a sound theoretical framework for modeling such behavior and introducing the most recent algorithms for extracting the lognormal components of complex movements in 2, 2.5 and 3 dimensions. It also vividly reports the most advanced applications to handwriting analysis and recognition, signature and writer verification, gesture recognition and calligraphy generation, evaluation of motor skills, improvement/degradation with aging, handwriting learning, education and developmental deficits, prescreening of children with ADHD (Attention Deficit Hyperactivity Disorder), monitoring of concussion recovery, diagnosis and monitoring of Alzheimer's and Parkinson's diseases, and aging effects in speech and handwriting. The volume provides a unique and useful source of references on the lognormality principle, an update on the most recent advances, and an outlook on the most promising future developments in e-Security, e-Learning and e-Health.
With the growing interest in and use of big data analytics in many industries and in many research fields around the globe, this new volume addresses the need for a comprehensive resource on the core concepts of big data analytics along with the tools, techniques, and methodologies. The book gives the why and the how of big data analytics in an organized and straightforward manner, using both theoretical and practical approaches. The book's authors have organized the contents in a systematic manner, starting with an introduction and overview of big data analytics and then delving into pre-processing methods, feature selection methods and algorithms, big data streams, and big data classification. Such terms and methods as swarm intelligence, data mining, the bat algorithm and genetic algorithms, big data streams, and many more are discussed. The authors explain how deep learning and machine learning along with other methods and tools are applied in big data analytics. The last section of the book presents a selection of illustrative case studies that show examples of the use of data analytics in industries such as health care, business, education, and social media. Research Practitioner's Handbook on Big Data Analytics will be a valuable addition to the libraries of practitioners in data collection in many industries along with research scholars and faculty in the domain of big data analytics. The book can also serve as a handy textbook for courses in data collection, data mining, and big data analytics.
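Among the methods the handbook names are genetic algorithms. As a hedged illustration of the basic idea (my own toy sketch, not the book's material), the following tiny GA maximizes f(x) = -(x - 3)^2 on [0, 10]; the population size, mutation scale, and operators are arbitrary illustrative choices.

```python
import random

def fitness(x):
    return -(x - 3) ** 2          # peak at x = 3

def evolve(generations=60, pop_size=20, seed=0):
    rng = random.Random(seed)     # seeded for reproducibility
    pop = [rng.uniform(0, 10) for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def pick():               # tournament selection of size 2
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) > fitness(b) else b
        children = []
        for _ in range(pop_size):
            child = (pick() + pick()) / 2     # arithmetic crossover
            child += rng.gauss(0, 0.3)        # Gaussian mutation
            children.append(min(10.0, max(0.0, child)))
        pop = children
        pop[0] = best                         # elitism: never lose the best
        best = max(pop, key=fitness)
    return best
```

With elitism the best fitness is monotone non-decreasing across generations, which is the property the test below checks.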
Recommender systems provide users (businesses or individuals) with personalized online recommendations of products or information, to address the problem of information overload and improve personalized services. Recent successful applications of recommender systems are providing solutions to transform online services for e-government, e-business, e-commerce, e-shopping, e-library, e-learning, e-tourism, and more. This unique compendium not only describes theoretical research but also reports on new application developments, prototypes, and real-world case studies of recommender systems. The comprehensive volume provides readers with a timely snapshot of how new recommendation methods and algorithms can overcome challenging issues. Furthermore, the monograph systematically presents three dimensions of recommender systems - basic recommender system concepts, advanced recommender system methods, and real-world recommender system applications. By providing state-of-the-art knowledge, this excellent reference text will immensely benefit researchers, managers, and professionals in business, government, and education seeking to understand the concepts, methods, algorithms, and application developments in recommender systems.
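One of the basic recommender-system concepts such a volume covers is neighborhood-based collaborative filtering. The sketch below is my own toy illustration of that idea, not the book's algorithm: the ratings data is invented, similarity is cosine over co-rated items, and the recommendation rule ("top unseen item of the most similar user") is the simplest possible choice.

```python
import math

ratings = {
    "alice": {"matrix": 5, "inception": 4, "up": 1},
    "bob":   {"matrix": 5, "inception": 5, "coco": 4},
    "carol": {"up": 5, "coco": 5, "inception": 1},
}

def cosine_sim(u, v):
    """Cosine similarity over the items both users rated."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in common)
    nu = math.sqrt(sum(ratings[u][i] ** 2 for i in common))
    nv = math.sqrt(sum(ratings[v][i] ** 2 for i in common))
    return dot / (nu * nv)

def recommend(user):
    """Recommend the top item, unseen by `user`, from the nearest neighbor."""
    others = [v for v in ratings if v != user]
    neighbor = max(others, key=lambda v: cosine_sim(user, v))
    unseen = {i: r for i, r in ratings[neighbor].items()
              if i not in ratings[user]}
    return max(unseen, key=unseen.get) if unseen else None
```

Here alice's tastes align with bob's, so she is recommended bob's highest-rated title she hasn't seen.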
Multi-objective optimization problems (MOPs) and uncertain optimization problems (UOPs), which are widespread in real life, are challenging problems in the fields of decision making, system design, and scheduling, amongst others. Decomposition exploits the ideas of 'making things simple' and 'divide and conquer' to transform a complex problem into a series of simple ones, with the aim of reducing computational complexity. To tackle the two types of complicated optimization problems mentioned above, this book introduces the decomposition strategy and conducts a systematic study to perfect the use of decomposition in the field of multi-objective optimization and to extend its use to the field of uncertain optimization.
Provides a complete update and reorganization of the previous books, with some material moving online; includes new problems, projects, and exercises; includes interactive coding resources to accompany the book, including examples in the text, exercises, projects, and reflection questions.
This book acquaints readers with recent developments in dynamical systems theory and its applications, with a strong focus on the control and estimation of nonlinear systems. Several algorithms are proposed and worked out for a set of model systems, in particular so-called input-affine or bilinear systems, which can serve to approximate a wide class of nonlinear control systems. These can either take the form of state space models or be represented by an input-output equation. The approach taken here further highlights the role of modern mathematical and conceptual tools, including differential algebraic theory, observer design for nonlinear systems and generalized canonical forms.
This book provides insights into contemporary issues and challenges in soft computing applications and techniques in healthcare. It will be a useful guide to identify, categorise and assess the role of different soft computing techniques for disease diagnosis and prediction enabled by technological advancements. The book explores applications in soft computing and covers empirical properties of artificial neural networks (ANN), evolutionary computing, fuzzy logic and statistical techniques. It presents basic and advanced concepts to help beginners and industry professionals get up to speed on the latest developments in soft computing and healthcare systems. It incorporates the latest methodologies and challenges facing soft computing, examines descriptive, predictive and social network techniques and discusses analytics tools and their role in providing effective solutions for science and technology. The primary users of the book include researchers, academicians, postgraduate students, specialists and practitioners. Dr. Ashish Mishra is a professor in the Department of Computer Science and Engineering, Gyan Ganga Institute of Technology and Sciences, Jabalpur, Madhya Pradesh, India. He has contributed to organising the INSPIRE Science Internship Camp. He is a member of the Institute of Electrical and Electronics Engineers and is a life member of the Computer Society of India. His research interests include the Internet of Things, data mining, cloud computing, image processing and knowledge-based systems. He holds nine patents in Intellectual Property, India. He has authored four books in the areas of data mining, image processing and LaTeX. Dr. G. Suseendran is an assistant professor, Department of Information Technology, School of Computing Sciences, Vels Institute of Science, Technology & Advanced Studies (VISTAS), Chennai, Tamil Nadu, India.
His research interests include ad-hoc networks, the Internet of Things, data mining, cloud computing, image processing, knowledge-based systems, and Web information exploration. He has published more than 75 research papers in various international venues, including Science Citation Index- and Scopus-indexed journals, Springer book chapters, IEEE Access, and UGC-refereed journals. Prof. Trung-Nghia Phung is an associate professor and Head of Academic Affairs, Thai Nguyen University of Information and Communication Technology (ICTU). He has published more than 60 research papers. His main research interest lies in the field of speech, audio, and biomedical signal processing. He serves as a technical committee program member, track chair, session chair, and reviewer for many international conferences and journals. He was a co-chair of the International Conference on Advances in Information and Communication Technology 2016 (ICTA 2016) and a session chair of the 4th International Conference on Information System Design and Intelligent Applications (INDIA 2017).
This book is an essential tool written to be used as the primary text for an undergraduate or early postgraduate course, as well as a reference book for engineers and scientists who want to quickly develop finite-element programs. Regarding the formulation of the finite element method, the book emphasizes the essential unity of all processes of approximation used in the solution of differential equations, such as finite differences, finite elements and boundary elements. Computational aspects are presented in Maple. Three Maple packages were specially developed for this book and are included on a companion CD-ROM.
This book discusses the role of mobile network data in urban informatics, particularly how mobile network data is utilized in the mobility context, where approaches, models, and systems are developed for understanding travel behavior. The objectives of this book are thus to evaluate the extent to which mobile network data reflects travel behavior and to develop guidelines on how to best use such data to understand and model travel behavior. To achieve these objectives, the book attempts to evaluate the strengths and weaknesses of this data source for urban informatics and its applicability to the development and implementation of travel behavior models through a series of the authors' research studies. Traditionally, survey-based information is used as an input for travel demand models that predict future travel behavior and transportation needs. A survey-based approach is however costly and time-consuming, and hence its information can be dated and limited to a particular region. Mobile network data thus emerges as a promising alternative data source that is massive in both cross-sectional and longitudinal perspectives, and one that provides both broader geographic coverage of travelers and longer-term travel behavior observation. The two most common types of travel demand model that have played an essential role in managing and planning for transportation systems are four-step models and activity-based models. The book's chapters are structured on the basis of these travel demand models in order to provide researchers and practitioners with an understanding of urban informatics and the important role that mobile network data plays in advancing the state of the art from the perspectives of travel behavior research.
The vast circulations of mobile devices, sensors and data mean that the social world is now defined by a complex interweaving of human and machine agency. Key to this is the growing power of algorithms - the decision-making parts of code - in our software-dense and data-rich environments. Algorithms can shape how we are treated, what we know, who we connect with and what we encounter, and they present us with some important questions about how society operates and how we understand it. This book offers a series of concepts, approaches and ideas for understanding the relations between algorithms and power. Each chapter provides a unique perspective on the integration of algorithms into the social world. As such, this book directly tackles some of the most important questions facing the social sciences today. This book was originally published as a special issue of Information, Communication & Society.
Distributed Systems: An Algorithmic Approach, Second Edition provides a balanced and straightforward treatment of the underlying theory and practical applications of distributed computing. As in the previous edition, the language is kept as unobscured as possible; clarity is given priority over mathematical formalism. This easily digestible text: features significant updates that mirror the phenomenal growth of distributed systems; explores new topics related to peer-to-peer and social networks; and includes fresh exercises, examples, and case studies. Supplying a solid understanding of the key principles of distributed computing and their relationship to real-world applications, Distributed Systems: An Algorithmic Approach, Second Edition makes both an ideal textbook and a handy professional reference.