Reviews the literature of the Moth-Flame Optimization algorithm; Provides an in-depth analysis of the equations, mathematical models, and mechanisms of the Moth-Flame Optimization algorithm; Proposes different variants of the Moth-Flame Optimization algorithm to solve binary, multi-objective, noisy, dynamic, and combinatorial optimization problems; Demonstrates how to design, develop, and test different hybrids of the Moth-Flame Optimization algorithm; Introduces several application areas of the Moth-Flame Optimization algorithm, focusing on sustainability.
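As a taste of the mechanism the book analyzes: in the commonly published formulation of Moth-Flame Optimization, each moth spirals logarithmically around an assigned flame while the number of flames shrinks over the run. A minimal sketch follows (simplified flame update and illustrative parameter choices, not necessarily this book's exact notation):

    import numpy as np

    def mfo(objective, dim, n_moths=30, iters=200, lb=-10.0, ub=10.0, b=1.0):
        # Moths fly in logarithmic spirals around flames (best positions so far).
        moths = np.random.uniform(lb, ub, (n_moths, dim))
        for it in range(iters):
            fitness = np.apply_along_axis(objective, 1, moths)
            flames = moths[np.argsort(fitness)]                # sorted best-first
            n_flames = round(n_moths - it * (n_moths - 1) / iters)  # shrink to 1
            for i in range(n_moths):
                j = min(i, n_flames - 1)             # moth i pairs with flame j
                d = np.abs(flames[j] - moths[i])     # distance to assigned flame
                t = np.random.uniform(-1, 1, dim)
                moths[i] = d * np.exp(b * t) * np.cos(2 * np.pi * t) + flames[j]
            moths = np.clip(moths, lb, ub)
        fitness = np.apply_along_axis(objective, 1, moths)
        return moths[np.argmin(fitness)]

    # Minimize the 5-dimensional sphere function:
    best = mfo(lambda x: float(np.sum(x**2)), dim=5)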
This book provides a unified framework that describes how genetic learning can be used to design pattern recognition and learning systems. It examines how a search technique, the genetic algorithm, can be used for pattern classification, mainly through approximating decision boundaries. Coverage also demonstrates the effectiveness of the genetic classifiers vis-a-vis several widely used classifiers, including neural networks.
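The book evolves sets of hyperplanes to approximate decision boundaries; as a rough single-hyperplane illustration of that idea (all GA settings here are invented for the sketch):

    import numpy as np

    def ga_linear_classifier(X, y, pop=60, gens=150, sigma=0.3, seed=0):
        # Evolve a hyperplane w.x + b whose sign matches labels y in {-1, +1}.
        rng = np.random.default_rng(seed)
        P = rng.normal(size=(pop, X.shape[1] + 1))       # individuals: [w | b]
        def accuracy(ind):
            return np.mean(np.sign(X @ ind[:-1] + ind[-1]) == y)
        for _ in range(gens):
            scores = np.array([accuracy(p) for p in P])
            parents = P[np.argsort(scores)[-pop // 2:]]  # keep the fitter half
            pairs = rng.integers(0, len(parents), size=(pop, 2))
            alpha = rng.random((pop, 1))                 # blend crossover
            P = alpha * parents[pairs[:, 0]] + (1 - alpha) * parents[pairs[:, 1]]
            P += rng.normal(scale=sigma, size=P.shape)   # Gaussian mutation
        return max(P, key=accuracy)

    # Two linearly separable point clouds:
    X = np.vstack([np.random.randn(50, 2) + 2, np.random.randn(50, 2) - 2])
    y = np.array([1] * 50 + [-1] * 50)
    w_b = ga_linear_classifier(X, y)
    print(np.mean(np.sign(X @ w_b[:-1] + w_b[-1]) == y))  # training accuracy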
Due to the efficacy and optimization potential of genetic and evolutionary algorithms, they are used in learning and modeling, especially with the advent of big-data-related problems. This book presents the algorithms and strategies specifically associated with pertinent issues in the materials science domain. It discusses the procedures for evolutionary multi-objective optimization of objective functions created through these procedures and introduces available codes. Recent applications ranging from primary metal production to materials design are covered. It also describes a hybrid modeling strategy and other common modeling and simulation strategies such as molecular dynamics and cellular automata. Features: Focuses on data-driven evolutionary modeling and optimization, including evolutionary deep learning. Includes details on both algorithms and their applications in materials science and technology. Discusses hybrid data-driven modeling that couples evolutionary algorithms with generic computing strategies. Thoroughly discusses applications of pertinent strategies in metallurgy and materials. Provides an overview of the major single- and multi-objective evolutionary algorithms. This book is aimed at researchers, professionals, and graduate students in Materials Science, Data-Driven Engineering, Metallurgical Engineering, Computational Materials Science, Structural Materials, and Functional Materials.
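Central to the evolutionary multi-objective optimization the book covers is Pareto dominance. A minimal sketch of extracting the nondominated front from a set of objective vectors, assuming minimization in every objective:

    def pareto_front(points):
        # A point is dominated if another point is <= in all objectives
        # and different (hence strictly better in at least one).
        return [p for p in points
                if not any(all(q[i] <= p[i] for i in range(len(p))) and q != p
                           for q in points)]

    # Trade-off between two objectives; (3, 3) is dominated by (2, 2):
    print(pareto_front([(1, 5), (2, 2), (4, 1), (3, 3)]))  # [(1, 5), (2, 2), (4, 1)]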
Activity recognition has emerged as a challenging and high-impact research field, as over the past years smaller and more powerful sensors have been introduced in widespread consumer devices. Validation of techniques and algorithms requires large-scale human activity corpora and improved methods to recognize activities and the contexts in which they occur. This book deals with the challenges of designing valid and reproducible experiments, running large-scale dataset collection campaigns, designing activity and context recognition methods that are robust and adaptive, and evaluating activity recognition systems in the real world with real users.
This book focuses on the development of approximation-related algorithms and their relevant applications. Individual contributions are written by leading experts and reflect emerging directions and connections in data approximation and optimization. Chapters discuss state-of-the-art topics with highly relevant applications throughout science, engineering, technology and the social sciences. Academics, researchers, data science practitioners, business analysts, social sciences investigators and graduate students will find the numerous illustrations, applications, and examples useful. This volume is based on the conference Approximation and Optimization: Algorithms, Complexity, and Applications, held at the National and Kapodistrian University of Athens, Greece, June 29-30, 2017. The mix of survey and research content includes topics in approximations to discrete noisy data; binary sequences; design of networks and energy systems; fuzzy control; large-scale optimization; noisy data; data-dependent approximation; networked control systems; machine learning; optimal design; the no free lunch theorem; non-linearly constrained optimization; and spectroscopy.
This book introduces a new scheduler to fairly and efficiently distribute system resources among many users with varying usage patterns who compete for them in large shared computing environments. The Rawlsian Fair scheduler developed for this effort is shown to boost performance while reducing delay in high performance computing workloads of certain types, including the following four types examined in this book:
i. Class A - similar but complementary workloads
ii. Class B - similar but steady vs. intermittent workloads
iii. Class C - large vs. small workloads
iv. Class D - large vs. noise-like workloads
This new scheduler achieves short-term fairness on small timescales, meeting the demand for rapid response to varying workloads and usage profiles. The Rawlsian Fair scheduler is shown to consistently benefit workload Classes C and D, while it benefits Class A and B workloads only where they become disproportionate as the number of users increases. A simulation framework, dSim, simulates the new Rawlsian Fair scheduling mechanism. dSim helps achieve instantaneous fairness in High Performance Computing environments, effective utilization of computing resources, and user satisfaction through the Rawlsian Fair scheduler.
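The book's Rawlsian Fair scheduler is considerably more elaborate, but the Rawlsian principle itself, maximizing the share of the worst-off user, can be sketched with classic progressive filling on a single resource (names and tolerances here are illustrative):

    def max_min_fair(capacity, demands):
        # Repeatedly split the remaining capacity equally among unsatisfied
        # users, capping each at their demand; the result maximizes the
        # minimum allocation (a max-min fair, i.e. Rawlsian, split).
        alloc = {u: 0.0 for u in demands}
        remaining, active = capacity, set(demands)
        while active and remaining > 1e-12:
            share = remaining / len(active)
            for u in list(active):
                give = min(share, demands[u] - alloc[u])
                alloc[u] += give
                remaining -= give
                if alloc[u] >= demands[u] - 1e-12:
                    active.discard(u)          # demand fully met
        return alloc

    # 10 units among demands 2, 4, 8 -> {'a': 2.0, 'b': 4.0, 'c': 4.0}
    print(max_min_fair(10, {"a": 2, "b": 4, "c": 8}))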
The solitaire game "The Tower of Hanoi" was invented in the 19th century by the French number theorist Edouard Lucas. The book presents its mathematical theory and offers a survey of the historical development from predecessors up to recent research. In addition to long-standing myths, it provides a detailed overview of the essential mathematical facts with complete proofs, and also includes unpublished material, e.g., on some captivating integer sequences. The main objects of research today are the so-called Hanoi graphs and the related Sierpinski graphs. Acknowledging the great popularity of the topic in computer science, algorithms, together with their correctness proofs, form an essential part of the book. In view of the most important practical applications, namely in physics, network theory and cognitive (neuro)psychology, the book also addresses other structures related to the Tower of Hanoi and its variants. The updated second edition includes, for the first time in English, the breakthrough reached with the solution of "The Reve's Puzzle" in 2014. This is a special case of the famed Frame-Stewart conjecture which is still open after more than 75 years. Enriched with elaborate illustrations, connections to other puzzles and challenges for the reader in the form of (solved) exercises as well as problems for further exploration, this book is enjoyable reading for students, educators, game enthusiasts and researchers alike.
Excerpts from reviews of the first edition:
"The book is an unusual, but very welcome, form of mathematical writing: recreational mathematics taken seriously and serious mathematics treated historically. I don't hesitate to recommend this book to students, professional research mathematicians, teachers, and to readers of popular mathematics who enjoy more technical expository detail." Chris Sangwin, The Mathematical Intelligencer 37(4) (2015) 87f.
"The book demonstrates that the Tower of Hanoi has a very rich mathematical structure, and as soon as we tweak the parameters we surprisingly quickly find ourselves in the realm of open problems." Laszlo Kozma, ACM SIGACT News 45(3) (2014) 34ff.
"Each time I open the book I discover a renewed interest in the Tower of Hanoi. I am sure that this will be the case for all readers." Jean-Paul Allouche, Newsletter of the European Mathematical Society 93 (2014) 56.
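The classical three-peg puzzle is solved by the well-known recursion in exactly 2^n - 1 moves (the four-peg Reve's Puzzle requires the subtler Frame-Stewart scheme discussed in the book). A minimal sketch:

    def hanoi(n, source="A", target="C", spare="B"):
        # Move n disks from source to target in the optimal 2**n - 1 moves.
        if n == 0:
            return
        hanoi(n - 1, source, spare, target)  # park n-1 smaller disks on spare
        print(f"move disk {n}: {source} -> {target}")
        hanoi(n - 1, spare, target, source)  # stack them back on top of disk n

    hanoi(3)  # prints the 7 moves for three disks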
The information infrastructure - comprising computers, embedded devices, networks and software systems - is vital to operations in every sector: information technology, telecommunications, energy, banking and finance, transportation systems, chemicals, agriculture and food, defense industrial base, public health and health care, national monuments and icons, drinking water and water treatment systems, commercial facilities, dams, emergency services, commercial nuclear reactors, materials and waste, postal and shipping, and government facilities. Global business and industry, governments, indeed society itself, cannot function if major components of the critical information infrastructure are degraded, disabled or destroyed. This book, Critical Infrastructure Protection IV, is the fourth volume in the annual series produced by IFIP Working Group 11.10 on Critical Infrastructure Protection, an active international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts related to critical infrastructure protection. The book presents original research results and innovative applications in the area of infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. This volume contains seventeen edited papers from the Fourth Annual IFIP Working Group 11.10 International Conference on Critical Infrastructure Protection, held at the National Defense University, Washington, DC, March 15-17, 2010. The papers were refereed by members of IFIP Working Group 11.10 and other internationally-recognized experts in critical infrastructure protection.
Presenting a novel biomimetic design method for transferring design solutions from nature to technology, this book focuses on structure-function patterns in nature and advanced modeling tools derived from TRIZ, the theory of inventive problem-solving. The book includes an extensive literature review on biomimicry as an engine of both innovation and sustainability, and discusses in detail the biomimetic design process, current biomimetic design methods and tools. The structural biomimetic design method for innovation and sustainability put forward in this text encompasses (1) the research method and rationale used to develop and validate this new design method; (2) the suggested design algorithm and tools including the Find structure database, structure-function patterns and ideality patterns; and (3) analyses of four case studies describing how to use the proposed method. This book offers an essential resource for designers who wish to use nature as a source of inspiration and knowledge, innovators and sustainability experts, and scientists and researchers, amongst others.
This book provides in-depth and wide-ranging analyses of the emergence, and subsequent ubiquity, of algorithms in diverse realms of social life. The plurality of Algorithmic Cultures emphasizes: 1) algorithms' increasing importance in the formation of new epistemic and organizational paradigms; and 2) the multifaceted analyses of algorithms across an increasing number of research fields. The authors in this volume address the complex interrelations between social groups and algorithms in the construction of meaning and social interaction. The contributors highlight the performative dimensions of algorithms by exposing the dynamic processes through which algorithms - themselves the product of a specific approach to the world - frame reality, while at the same time organizing how people think about society. With contributions from leading experts from Media Studies, Social Studies of Science and Technology, Cultural and Media Sociology from Canada, France, Germany, UK and the USA, this volume presents cutting-edge empirical and conceptual research that includes case studies on social media platforms, gaming, financial trading and mobile security infrastructures.
Computer science is the science of the future, and already underlies every facet of business and technology, and much of our everyday lives. In addition, it will play a crucial role in the science of the 21st century, which will be dominated by biology and biochemistry, similar to the role of mathematics in the physical sciences of the 20th century. In this award-winning best-seller, the author and his co-author focus on the fundamentals of computer science, which revolve around the notion of the "algorithm." They discuss the design of algorithms, and their efficiency and correctness, the inherent limitations of algorithms and computation, quantum algorithms, concurrency, large systems and artificial intelligence. Throughout, the authors, in their own words, stress the 'fundamental and robust nature of the science in a form that is virtually independent of the details of specific computers, languages and formalisms'. This version of the book is published to celebrate 25 years since its first edition, and in honor of the Alan M. Turing Centennial year. Turing was a true pioneer of computer science, whose work forms the underlying basis of much of this book.
Provides a complete update and reorganization of the previous books, with some material moving online; Includes new problems, projects, and exercises; Includes interactive coding resources to accompany the book, including examples in the text, exercises, projects, and reflection questions.
This book presents a comprehensive study of the different tools and techniques available to perform network forensics. Various aspects of network forensics are reviewed, along with related technologies and their limitations. This helps security practitioners and researchers better understand the problem, the current solution space, and the future research scope for detecting and investigating various network intrusions efficiently.

Forensic computing is rapidly gaining importance, since the amount of crime involving digital systems is steadily increasing. Furthermore, the area is still underdeveloped and poses many technical and legal challenges. The rapid development of the Internet over the past decade appears to have facilitated an increase in the incidents of online attacks. Many factors embolden attackers: the speed with which an attack can be carried out, the anonymity provided by the medium, the nature of the medium, in which digital information can be stolen without actually removing it, the increased availability of potential victims, and the global impact of the attacks.

Forensic analysis is performed at two different levels: computer forensics and network forensics. Computer forensics deals with the collection and analysis of data from computer systems, networks, communication streams and storage media in a manner admissible in a court of law. Network forensics deals with the capture, recording and analysis of network events in order to discover evidential information about the source of security attacks in a court of law. Network forensics is not another term for network security; it is an extended phase of network security, as the data for forensic analysis are collected from security products like firewalls and intrusion detection systems. The results of this data analysis are utilized for investigating the attacks. Network forensics generally refers to the collection and analysis of network data such as network traffic, firewall logs, IDS logs, etc. Technically, it is a member of the already-existing and expanding field of digital forensics. Analogously, network forensics is defined as "the use of scientifically proven techniques to collect, fuse, identify, examine, correlate, analyze, and document digital evidence from multiple, actively processing and transmitting digital sources for the purpose of uncovering facts related to the planned intent, or measured success, of unauthorized activities meant to disrupt, corrupt, and/or compromise system components, as well as providing information to assist in response to or recovery from these activities."

Network forensics plays a significant role in the security of today's organizations. On the one hand, it helps in learning the details of external attacks, ensuring that similar future attacks are thwarted. Additionally, network forensics is essential for investigating insiders' abuses, which constitute the second costliest type of attack within organizations. Finally, law enforcement requires network forensics for crimes in which a computer or digital system is either the target of a crime or used as a tool in carrying out a crime. Network security protects the system against attack, while network forensics focuses on recording evidence of the attack. Network security products are generalized and look for possible harmful behaviors; this monitoring is a continuous process and is performed all through the day. Network forensics, in contrast, involves post-mortem investigation of the attack and is initiated after crime notification. There are many tools that assist in capturing data transferred over networks, so that an attack or the malicious intent of an intrusion may be investigated; similarly, various network forensic frameworks have been proposed in the literature.
Classifies the optimization problems of ports into five scheduling decisions. For each decision, it supplies an overview, formulates the decision as constraint satisfaction and optimization problems, and then covers possible solutions, implementation, and performance. Part One explores the various optimization problems in modern container terminals, while Part Two details advanced algorithms for the minimum cost flow (MCF) problem and for the scheduling problem of AGVs in ports. A complete package that can help readers address the scheduling problems of AGVs in ports.
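As a rough illustration of the MCF formulation (the instance, node names and costs are invented, and the book's algorithms are specialized rather than off-the-shelf), a toy port-flow problem can be posed and solved with networkx:

    import networkx as nx

    # Ship 4 container jobs from a quay to a yard through two AGVs with
    # different costs and capacities.
    G = nx.DiGraph()
    G.add_node("quay", demand=-4)    # negative demand = supply of 4 units
    G.add_node("yard", demand=4)     # positive demand = requirement of 4 units
    G.add_edge("quay", "agv1", capacity=3, weight=2)
    G.add_edge("quay", "agv2", capacity=3, weight=5)
    G.add_edge("agv1", "yard", capacity=3, weight=1)
    G.add_edge("agv2", "yard", capacity=3, weight=1)

    flow = nx.min_cost_flow(G)
    print(flow)                        # agv1 carries 3 units, agv2 carries 1
    print(nx.cost_of_flow(G, flow))    # total cost: 3*(2+1) + 1*(5+1) = 15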
Chapters include: Introduction to Quantum Natural Language Processing; Overview of Leadership and AI; The Age of Quantum Superiority; Challenges to Today's Leadership; and AI-Induced Strategic Implementation and Organizational Performance.
This book discusses an important area of numerical optimization: the interior-point method. This topic has been popular since the 1980s, when people gradually realized that simplex algorithms were not convergent in polynomial time while many interior-point algorithms could be proved to converge in polynomial time. However, for a long time there was a noticeable gap between the theoretical polynomial bounds of the interior-point algorithms and the efficiency of these algorithms. Strategies that were important to computational efficiency became barriers in the proof of good polynomial bounds: the more these strategies were used in an algorithm, the worse its polynomial bound became. To further exacerbate the problem, Mehrotra's predictor-corrector (MPC) algorithm (the most popular and efficient interior-point algorithm until recently) uses all the good strategies yet has no proof of convergence; MPC therefore lacks proven polynomiality, the same critical issue that afflicts the simplex method. This book discusses recent developments that resolve the dilemma. It has three major parts. The first, comprising Chapters 1, 2, 3, and 4, presents some of the most important algorithms from the development of the interior-point method around the 1990s, most of which are widely known. The main purpose of this part is to explain the dilemma described above by analyzing these algorithms' polynomial bounds and summarizing the computational experience associated with them. The second part, comprising Chapters 5, 6, 7, and 8, describes how to solve the dilemma step by step using arc-search techniques. At the end of this part, a very efficient algorithm with the lowest polynomial bound is presented. The last part, comprising Chapters 9, 10, 11, and 12, extends arc-search techniques to some more general problems, such as convex quadratic programming, the linear complementarity problem, and semi-definite programming.
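For orientation, here is a minimal sketch of a plain primal-dual path-following step of the kind analyzed in the book's first part; it is not arc-search, omits the safeguards a real solver needs, and its parameter choices are illustrative:

    import numpy as np

    def lp_interior_point(A, b, c, iters=60, sigma=0.1):
        # Primal-dual path-following for: min c.x  s.t.  Ax = b, x >= 0.
        m, n = A.shape
        x, s, lam = np.ones(n), np.ones(n), np.zeros(m)
        for _ in range(iters):
            mu = sigma * (x @ s) / n          # target point on the central path
            r_dual = c - A.T @ lam - s        # dual residual
            r_prim = b - A @ x                # primal residual
            r_cent = mu - x * s               # centering residual
            D = x / s
            M = A @ (D[:, None] * A.T)        # normal-equations matrix A D A^T
            rhs = r_prim + A @ (D * r_dual - r_cent / s)
            dlam = np.linalg.solve(M, rhs)    # Newton step, then back-substitute
            ds = r_dual - A.T @ dlam
            dx = (r_cent - x * ds) / s
            alpha = 1.0                       # damp to keep x, s strictly positive
            for v, dv in ((x, dx), (s, ds)):
                neg = dv < -1e-12
                if neg.any():
                    alpha = min(alpha, 0.9 * np.min(-v[neg] / dv[neg]))
            x, lam, s = x + alpha * dx, lam + alpha * dlam, s + alpha * ds
        return x

    # min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0   -> approximately [1, 0]
    print(lp_interior_point(np.array([[1.0, 1.0]]), np.array([1.0]),
                            np.array([1.0, 2.0])))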
Data has emerged as a key component that determines how interactions across the world are structured, mediated and represented. This book examines these new data publics and the areas in which they become operative, via analysis of politics, geographies, environments and social media platforms. By claiming to offer a mechanism to translate every conceivable occurrence into an abstract code that can be endlessly manipulated, digitally processed data has caused conventional reference systems, which hinge on our ability to mark points of origin, to rapidly implode. Authors from a range of disciplines provide insights into such a political economy of data capitalism; the political possibilities of techno-logics beyond data appropriation and data refusal; questions of visual, spatial and geographical organization; emergent ways of life and the environments that sustain them; and the current challenges of data publics, which are explored via case studies of three of the most influential platforms in the social media economy today: Facebook, Instagram and WhatsApp. Data Publics will be of great interest to academics and students in the fields of computer science, philosophy, sociology, media and communication studies, architecture, visual culture, art and design, and urban and cultural studies.
This book describes the essential components of the SCION secure Internet architecture, the first architecture designed foremost for strong security and high availability. Among its core features, SCION also provides route control, explicit trust information, multipath communication, scalable quality-of-service guarantees, and efficient forwarding. The book includes functional specifications of the network elements, communication protocols among these elements, data structures, and configuration files. In particular, the book offers a specification of a working prototype. The authors provide a comprehensive description of the main design features for achieving a secure Internet architecture. They guide the reader throughout, structuring the book so that the technical detail gradually increases, and supporting the text with a glossary, an index, a list of abbreviations, answers to frequently asked questions, and special highlighting for examples and for sections that explain important research, engineering, and deployment features. The book is suitable for researchers, practitioners, and graduate students who are interested in network security.
Anyone browsing at the stationery store will see an incredible array of pop-up cards available for any occasion. The workings of pop-up cards and pop-up books can be remarkably intricate. Behind such designs lies beautiful geometry involving the intersection of circles, cones, and spheres, the movements of linkages, and other constructions. The geometry can be modelled by algebraic equations, whose solutions explain the dynamics. For example, several pop-up motions rely on the intersection of three spheres, a computation made every second for GPS location. Connecting the motions of the card structures with the algebra and geometry reveals abstract mathematics performing tangible calculations. Beginning with the nephroid in the 19th century, the mathematics of pop-up design is now at the frontiers of rigid origami and algorithmic computational complexity. All topics are accessible to those familiar with high-school mathematics; no calculus required. Explanations are supplemented by 140+ figures and 20 animations.
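The three-sphere intersection mentioned above is the computation GPS receivers perform. A minimal numpy sketch of the standard trilateration construction (not taken from the book), returning the two candidate points:

    import numpy as np

    def trilaterate(p1, p2, p3, r1, r2, r3):
        # Intersect three spheres with centers p1, p2, p3 and radii r1, r2, r3
        # by working in a frame where p1 is the origin and p2 lies on the x-axis.
        ex = (p2 - p1) / np.linalg.norm(p2 - p1)
        i = ex @ (p3 - p1)
        ey = p3 - p1 - i * ex
        ey /= np.linalg.norm(ey)
        ez = np.cross(ex, ey)
        d = np.linalg.norm(p2 - p1)
        j = ey @ (p3 - p1)
        x = (r1**2 - r2**2 + d**2) / (2 * d)
        y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
        z2 = r1**2 - x**2 - y**2
        if z2 < 0:
            raise ValueError("spheres do not intersect")
        z = np.sqrt(z2)
        base = p1 + x * ex + y * ey
        return base + z * ez, base - z * ez

    # Unit spheres centred at three corners of a right triangle:
    a, b = trilaterate(np.array([0.0, 0, 0]), np.array([1.0, 0, 0]),
                       np.array([0.0, 1, 0]), 1.0, 1.0, 1.0)
    print(a, b)   # (0.5, 0.5, +-0.707...)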
Features contributions from thought leaders across academia, industry, and government; Focuses on novel algorithms and practical applications.
The vast circulations of mobile devices, sensors and data mean that the social world is now defined by a complex interweaving of human and machine agency. Key to this is the growing power of algorithms - the decision-making parts of code - in our software-dense and data-rich environments. Algorithms can shape how we are treated, what we know, who we connect with and what we encounter, and they present us with some important questions about how society operates and how we understand it. This book offers a series of concepts, approaches and ideas for understanding the relations between algorithms and power. Each chapter provides a unique perspective on the integration of algorithms into the social world. As such, this book directly tackles some of the most important questions facing the social sciences today. This book was originally published as a special issue of Information, Communication & Society.
Though the reductionist approach to biology and medicine has led to several important advances, further progress with respect to the remaining challenges requires the integration of representation, characterization and modeling of the studied systems along a wide range of spatial and time scales. Such an approach, intrinsically related to systems biology, is poised to ultimately turn biology into a more precise and synthetic discipline, paving the way to extensive preventive and regenerative medicine [1], drug discovery [20] and treatment optimization [24]. A particularly appealing and effective approach to addressing the complexity of interactions inherent to biological systems is provided by the new area of complex networks [34, 30, 8, 13, 12]. Basically, it is an extension of graph theory [10], focusing on the modeling, representation, characterization, analysis and simulation of complex systems by considering many elements and their interconnections. Complex networks concepts and methods have been used to study disease [17], transcription networks [5, 6, 4], protein-protein networks [22, 36, 16, 39], metabolic networks [23] and anatomy [40].
- New advancements of fractal analysis with applications to many scientific, engineering, and societal issues
- Recent changes and challenges of fractal geometry with the rapid advancement of technology
- Chapters on novel theory and recent applications of fractals
- Offers recent findings, modelling and simulations of fractal analysis from eminent institutions across the world
- Analytical innovations of fractal analysis
- Edited collection with a variety of viewpoints
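As a concrete taste of fractal analysis, here is a box-counting dimension estimate: count the occupied grid boxes at several scales and fit the slope of log N(s) against log s (the scales and the unit-square assumption are illustrative):

    import numpy as np

    def box_counting_dimension(points, scales=(1, 2, 4, 8, 16, 32, 64)):
        # points lie in [0, 1)^2; the fitted slope estimates the dimension.
        counts = []
        for s in scales:
            occupied = {tuple(cell) for cell in (points * s).astype(int)}
            counts.append(len(occupied))
        slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
        return slope

    # Points on a diagonal line should give dimension ~1:
    t = np.random.rand(5000)
    print(box_counting_dimension(np.column_stack([t, t])))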
We live in an algorithmic society. Algorithms have become the main mediator through which power is enacted in our society. This book brings together three academic fields - Public Administration, Criminal Justice and Urban Governance - into a single conceptual framework, and offers a broad cultural-political analysis, addressing critical and ethical issues of algorithms. Governments are increasingly turning towards algorithms to predict criminality, deliver public services, allocate resources, and calculate recidivism rates. Mind-boggling amounts of data regarding our daily actions are analysed to make decisions that manage, control, and nudge our behaviour in everyday life. The contributions in this book offer a broad analysis of the mechanisms and social implications of algorithmic governance. Reporting from the cutting edge of scientific research, the result is illuminating and useful for understanding the relations between algorithms and power. Topics covered include: algorithmic governmentality; transparency and accountability; fairness in criminal justice and predictive policing; principles of good digital administration; and Artificial Intelligence (AI) in the smart city. This book is essential reading for students and scholars of Sociology, Criminology, Public Administration, Political Sciences, and Cultural Theory interested in the integration of algorithms into the governance of society.
You may like...
Concept Parsing Algorithms (CPA) for… - Uri Shafrir, Masha Etkind - Hardcover - R3,428 (Discovery Miles 34 280)
Data Abstraction and Problem Solving… - Janet Prichard, Frank Carrano - Paperback - R2,163 (Discovery Miles 21 630)
Principles of Radio Navigation for… - Sauta O.I., Shatrakov A.Y., … - Hardcover - R2,789 (Discovery Miles 27 890)