The process of learning words and languages may seem like an instinctual trait, inherent to nearly all humans from a young age. However, a vast body of research details the complexities of the word-learning process. Theoretical and Computational Models of Word Learning: Trends in Psychology and Artificial Intelligence combines cross-disciplinary research into one comprehensive volume to help readers gain a fuller understanding of the developmental processes and influences that make up the progression of word learning. Blending developmental psychology and artificial intelligence, this publication is intended for researchers, practitioners, and educators who are interested in language learning and its development, as well as in the computational models formed from these areas of research.
Research and development surrounding the use of data queries is receiving increased attention from computer scientists and data specialists alike. Through query technology, large volumes of data can be retrieved from databases, and information systems built on those databases can support problem solving and decision making across industries. The Handbook of Research on Innovative Database Query Processing Techniques focuses on the growing topic of database query processing methods, technologies, and applications. Aimed at providing an all-inclusive reference source of technologies and practices in advanced database query systems, this book investigates techniques including database and XML queries, spatiotemporal data queries, big data queries, metadata queries, and applications of database query systems, along with the latest research on information retrieval, data extraction, data management, and the design and development of database queries. This comprehensive handbook is a necessary resource for students, IT professionals, data analysts, and academicians interested in the latest methods for using queries to extract information from databases.
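The kind of query processing the handbook surveys can be illustrated with a minimal, self-contained sketch using Python's built-in SQLite driver; the schema and data below are invented for illustration and are not drawn from the book:

```python
import sqlite3

# Build a small in-memory database (hypothetical schema for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "east", 120.0), (2, "west", 75.5), (3, "east", 40.0)],
)

# A parameterized aggregate query: total order value per region,
# filtering out small orders before grouping.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders WHERE amount > ? "
    "GROUP BY region ORDER BY region",
    (50.0,),
).fetchall()
print(rows)  # → [('east', 120.0), ('west', 75.5)]
conn.close()
```

Parameter placeholders (`?`) keep the query plan separate from the data values, which is the basic mechanism most database query systems build on.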
Electronic discovery refers to a process in which electronic data is sought, located, secured, and searched with the intent of using it as evidence in a legal case. Computer forensics is the application of computer investigation and analysis techniques to perform an investigation to find out exactly what happened on a computer and who was responsible. IDC estimates that the U.S. market for computer forensics will grow from $252 million in 2004 to $630 million by 2009. Business is strong outside the United States as well: by 2011, the international market is estimated to reach $1.8 billion. The Techno Forensics Conference increased in size by almost 50% in its second year, another example of the rapid growth in the market.
At one time, the office was a physical place, and employees congregated in the same location to work together on projects. The advent of the internet and the world wide web, however, not only made the unthinkable possible, it forever changed the way people view both the office and work. The "Handbook of Research on Virtual Workplaces and the New Nature of Business Practices" compiles authoritative research from XX scholars from over XX countries, covering the issues surrounding the influx of information technology into the office environment, from the choice and effective use of technologies to the necessary participants in the virtual workplace.
In recent years, innovative technologies have led to rapid progression and accelerated research within the field of end-user computing. "Computational Advancements in End-User Technologies: Emerging Models and Frameworks" contains leading research and practices on the advancement, significance, and comprehensive nature of end-user computing. A defining collection of significant tools, applications, and methodologies within this expanding field of study, this publication provides academicians, researchers, and practitioners with a complete and practical resource of expert international findings.
It is clear that the digital age has fully embraced music production, distribution, and transcendence for an avid audience that demands more music in both quantity and versatility. However, the evolving world of digital music production faces a crisis of tremendous proportions: rapidly increasing online piracy that devastates radio stations, media channels, producers, composers, and artists, severely threatening the music industry. Digital Tools for Computer Music Production and Distribution presents research-based perspectives and solutions for integrating computational methods for music production, distribution, and access around the world, in addition to challenges facing the music industry in an age of digital access, content sharing, and crime. Highlighting the changing scope of the music industry and the role of the digital age in such transformations, this publication is an essential resource for computer programmers, sound engineers, language and speech experts, legal experts specializing in music piracy and rights management, researchers, and graduate-level students across disciplines.
New and important advancements in today's complexity theories in ICT require an extraordinary perspective on the interaction between living systems and information technologies. With human evolution and its continuous link with the development of new tools and environmental changes, technological advancements are paving the way for new evolutionary steps. Complexity Science, Living Systems, and Reflexing Interfaces: New Models and Perspectives is a collection of research provided by academics and scholars aiming to introduce important advancements in areas such as artificial intelligence, evolutionary computation, neural networks, and much more. This scholarly work provides contributions that define the line of development in complexity science.
System administration is about the design, running and maintenance of human-computer systems. Examples of human-computer systems include business enterprises, service institutions and any extensive machinery that is operated by, or interacts with, human beings. System administration is often thought of as the technological side of a system: the architecture, construction and optimization of the collaborating parts, but it also occasionally touches on softer factors such as user assistance (help desks), ethical considerations in deploying a system, and the larger implications of its design for others who come into contact with it.
Big data consists of data sets that are too large and complex for traditional data processing and data management applications. Therefore, to obtain the valuable information within the data, one must use a variety of innovative analytical methods, such as web analytics, machine learning, and network analytics. As the study of big data becomes more popular, there is an urgent demand for studies on high-level computational intelligence and computing services for analyzing this significant area of information science. Big Data Analytics for Sustainable Computing is a collection of innovative research that focuses on new computing and system development issues in emerging sustainable applications. Featuring coverage on a wide range of topics such as data filtering, knowledge engineering, and cognitive analytics, this publication is ideally designed for data scientists, IT specialists, computer science practitioners, computer engineers, academicians, professionals, and students seeking current research on emerging analytical techniques and data processing software.
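One-pass summarization is a typical example of the analytical methods mentioned above: when a data set is too large to hold in memory, statistics are computed in a single streaming pass. A minimal sketch using Welford's online algorithm, with invented sample values:

```python
# Each value is seen once and then discarded, so memory use stays
# constant no matter how much data flows through the stream.
def streaming_stats(stream):
    n, mean, m2 = 0, 0.0, 0.0
    for x in stream:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)  # running sum of squared deviations
    return n, mean, m2 / n  # count, mean, population variance

n, mu, var = streaming_stats(iter([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]))
print(n, round(mu, 3), round(var, 3))  # → 8 5.0 4.0
```

The same pattern generalizes to the data-filtering and cognitive-analytics pipelines the book covers: transform each record as it arrives rather than materializing the full data set.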
An all-star cast of authors analyzes the top IT security threats for 2008 as selected by the editors and readers of Infosecurity Magazine. This book, compiled from the Syngress Security Library, is an essential reference for any IT professional managing enterprise security. It serves as an early warning system, allowing readers to assess vulnerabilities, design protection schemes and plan for disaster recovery should an attack occur. Topics include Botnets, Cross-Site Scripting Attacks, Social Engineering, Physical and Logical Convergence, Payment Card Industry (PCI) Data Security Standards (DSS), Voice over IP (VoIP), and Asterisk Hacking.
In recent years, mobile technology and the Internet of Things have been used in mobile networks to meet new technical demands. Emerging needs have centered on data storage, computation, and low-latency management in smart cities, transport, smart grids, and a wide range of sustainable environments. Among federated learning's contributions is an effective framework for improving network security in heterogeneous industrial internet of things (IIoT) environments. Demystifying Federated Learning for Blockchain and Industrial Internet of Things rediscovers, redefines, and reestablishes the most recent applications of federated learning using blockchain and IIoT to optimize data for next-generation networks. It offers readers insights into the themes shaping the next generation of secure communication. Covering topics such as smart agriculture, object identification, and educational big data, this premier reference source is an essential resource for computer scientists, programmers, government officials, business leaders and managers, students and faculty of higher education, researchers, and academicians.
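The federated learning idea described above can be sketched in a few lines: clients train on their own private data and share only model weights, which a server averages (federated averaging). The linear model, synthetic client data, and learning rate below are illustrative assumptions, not examples from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

def local_step(w, X, y, lr=0.1):
    """One gradient step of linear regression on a client's private data."""
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

# Three clients, each holding private data drawn from the same true model.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

w = np.zeros(2)  # global model held by the server
for _ in range(100):
    # Server broadcasts w; clients update locally; server averages results.
    updates = [local_step(w, X, y) for X, y in clients]
    w = np.mean(updates, axis=0)

print(np.round(w, 2))  # converges close to [ 2. -1.]
```

The raw data never leaves a client; only the weight vectors cross the network, which is what makes the scheme attractive for heterogeneous IIoT deployments.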
How will AI evolve and what major innovations are on the horizon? What will its impact be on the job market, economy, and society? What is the path toward human-level machine intelligence? What should we be concerned about as artificial intelligence advances? Architects of Intelligence contains a series of in-depth, one-to-one interviews in which New York Times bestselling author Martin Ford uncovers the truth behind these questions from some of the brightest minds in the artificial intelligence community. Martin has wide-ranging conversations with twenty-three of the world's foremost researchers and entrepreneurs working in AI and robotics: Demis Hassabis (DeepMind), Ray Kurzweil (Google), Geoffrey Hinton (Univ. of Toronto and Google), Rodney Brooks (Rethink Robotics), Yann LeCun (Facebook), Fei-Fei Li (Stanford and Google), Yoshua Bengio (Univ. of Montreal), Andrew Ng (AI Fund), Daphne Koller (Stanford), Stuart Russell (UC Berkeley), Nick Bostrom (Univ. of Oxford), Barbara Grosz (Harvard), David Ferrucci (Elemental Cognition), James Manyika (McKinsey), Judea Pearl (UCLA), Josh Tenenbaum (MIT), Rana el Kaliouby (Affectiva), Daniela Rus (MIT), Jeff Dean (Google), Cynthia Breazeal (MIT), Oren Etzioni (Allen Institute for AI), Gary Marcus (NYU), and Bryan Johnson (Kernel). Martin Ford is a prominent futurist and author of the Financial Times Business Book of the Year, Rise of the Robots. He speaks at conferences and companies around the world on what AI and automation might mean for the future.
The field of data mining is receiving significant attention in today's information-rich society, where data is available from different sources and formats, in large volumes, and no longer constitutes a bottleneck for knowledge acquisition. This rich information has paved the way for novel areas of research, particularly in the crime data analysis realm. Data Mining Trends and Applications in Criminal Science and Investigations presents scientific concepts and frameworks of data mining and analytics implementation and uses across various domains, such as public safety, criminal investigations, intrusion detection, crime scene analysis, and suspect modeling. Exploring the diverse ways that data is revolutionizing the field of criminal science, this publication meets the research needs of law enforcement professionals, data analysts, investigators, researchers, and graduate-level students.
Cluster or co-cluster analyses are important tools in a variety of scientific areas. The introduction of this book presents the state of the art of well-established as well as more recent methods of co-clustering. The authors mainly deal with two-mode partitioning under different approaches, but pay particular attention to a probabilistic approach. Chapter 1 concerns clustering in general and model-based clustering in particular. The authors briefly review the classical clustering methods and focus on the mixture model. They present and discuss the use of different mixtures adapted to different types of data. The algorithms used are described, and related works with different classical methods are presented and commented upon. This chapter is useful in tackling the problem of co-clustering under the mixture approach. Chapter 2 is devoted to the latent block model proposed in the mixture approach context. The authors discuss this model in detail and present its interest regarding co-clustering. Various algorithms are presented in a general context. Chapter 3 focuses on binary and categorical data. It presents, in detail, the appropriate latent block mixture models. Variants of these models and algorithms are presented and illustrated using examples. Chapter 4 focuses on contingency data. Mutual information, phi-squared and model-based co-clustering are studied. Models, algorithms and connections among different approaches are described and illustrated. Chapter 5 presents the case of continuous data. In the same way, the different approaches used in the previous chapters are extended to this situation. Contents: 1. Cluster Analysis. 2. Model-Based Co-Clustering. 3. Co-Clustering of Binary and Categorical Data. 4. Co-Clustering of Contingency Tables. 5. Co-Clustering of Continuous Data.
About the Authors: Gerard Govaert is Professor at the University of Technology of Compiegne, France. He is also a member of the CNRS Laboratory Heudiasyc (heuristics and diagnosis of complex systems). His research interests include latent structure modeling, model selection, model-based cluster analysis, block clustering and statistical pattern recognition. He is one of the authors of the MIXMOD (MIXtureMODelling) software. Mohamed Nadif is Professor at the University of Paris-Descartes, France, where he is a member of LIPADE (Paris Descartes computer science laboratory) in the Mathematics and Computer Science department. His research interests include machine learning, data mining, model-based cluster analysis, co-clustering, factorization and data analysis.
Cluster analysis is an important tool in a variety of scientific areas. Chapter 1 briefly presents the state of the art of well-established as well as more recent methods. The hierarchical, partitioning and fuzzy approaches are discussed, amongst others. The authors review the difficulty these classical methods have in tackling high dimensionality, sparsity and scalability. Chapter 2 discusses the interest of co-clustering, presenting different approaches and defining a co-cluster. The authors focus on co-clustering as simultaneous clustering and discuss the cases of binary, continuous and co-occurrence data. The criteria and algorithms are described and illustrated on simulated and real data. Chapter 3 considers model-based co-clustering. A latent block model is defined for different kinds of data. The estimation of parameters and co-clustering is tackled under two approaches: maximum likelihood and classification maximum likelihood. Hard and soft algorithms are described and applied to simulated and real data. Chapter 4 considers co-clustering as matrix approximation. The tri-factorization approach is considered, and algorithms based on update rules are described. Links with numerical and probabilistic approaches are established. A combination of algorithms is proposed and evaluated on simulated and real data. Chapter 5 considers co-clustering, or bi-clustering, as the search for coherent co-clusters in biological terms, or the extraction of co-clusters under conditions. Classical algorithms are described and evaluated on simulated and real data. Different indices to evaluate the quality of co-clusters are noted and used in numerical experiments.
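The two-mode partitioning idea running through these chapters can be sketched with a small deterministic heuristic: rows and columns of a data matrix are partitioned simultaneously so that each (row-group, column-group) block is as homogeneous as possible. This is an illustrative simplification (two row groups, two column groups, mean-threshold initialization, assuming no group empties out), not the latent block model the authors develop:

```python
import numpy as np

def co_cluster_2x2(A, n_iter=10):
    # Initialize labels by thresholding row/column means at the grand mean.
    r = (A.mean(axis=1) < A.mean()).astype(int)  # row labels in {0, 1}
    c = (A.mean(axis=0) < A.mean()).astype(int)  # column labels in {0, 1}
    for _ in range(n_iter):
        # Means of the four blocks under the current labels.
        mu = np.array([[A[r == k][:, c == l].mean() for l in (0, 1)]
                       for k in (0, 1)])
        # Reassign each row to the row-group with smaller squared error.
        r = np.array([int(((row - mu[1, c]) ** 2).sum() <
                          ((row - mu[0, c]) ** 2).sum()) for row in A])
        # Reassign each column symmetrically.
        c = np.array([int(((A[:, j] - mu[r, 1]) ** 2).sum() <
                          ((A[:, j] - mu[r, 0]) ** 2).sum())
                      for j in range(A.shape[1])])
    return r, c, mu

# Block-structured toy matrix: two row groups and two column groups.
A = np.array([[9., 9., 1., 1.],
              [9., 9., 1., 1.],
              [1., 1., 5., 5.],
              [1., 1., 5., 5.]])
r, c, mu = co_cluster_2x2(A)
print(r, c)  # → [0 0 1 1] [0 0 1 1]
print(mu)    # block means: [[9. 1.], [1. 5.]]
```

The latent block model replaces these hard squared-error assignments with mixture densities per block, estimated by maximum likelihood or classification maximum likelihood as described above.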
Developed by professionals and experienced teachers from top schools across the country, the book has been divided into five sections, namely Word Structure and Knowledge, Reading Comprehension, Spoken and Written Expressions, Achievers Section, and Model Papers. The concepts have been explained briefly through solved examples and illustrations. To enhance candidates' problem-solving skills, Multiple Choice Questions (MCQs) with detailed solutions have been provided at the end of each chapter. Two mock test papers have been provided for practice. A CD containing a study chart for systematic preparation, tips and tricks to crack the English Olympiad, the pattern of the exam, and links to previous years' papers accompanies this book. The book is recommended for various school-level and competitive exams.
Human, Social, and Organizational Aspects of Health Information Systems offers an evidence-based management approach to issues associated with the human and social aspects of designing, developing, implementing, and maintaining health information systems across a healthcare organization, from individual, team, organizational, system, and international perspectives. Integrating knowledge from multiple levels, this book will benefit scholars and practitioners from the medical information, health service management, and information technology arenas.
INTELLECTUAL TECHNOLOGIES SET Coordinated by Jean-Max Noyer and Maryse Carmes The dynamics of production, circulation and dissemination of knowledge that are currently developing in the digital ecosystem testify to a profound change in capitalism. On the margins of the traditional duo of knowledge markets and exclusive property rights, the emerging notion of cultural commons is opening the door to new modes of production based on hybrid market arrangements and an inclusive understanding of property. This book studies the political economy of cultural commons in the digital ecosystem, outlining the contexts and areas of thought in which this concept has emerged and identifying the socio-economic, technical and political issues associated with it. It also analyzes the specific physical conditions that enable the implementation of the economy of cultural commons in a specific digital ecosystem, that of books, by studying the effects of digital libraries and self-publishing platforms.
A fascinating work on the history and development of cryptography, from the Egyptians to WWII. Many of the earliest books, particularly those dating back to the 1900s and before, are now extremely scarce and increasingly expensive. Hesperides Press is republishing these classic works in affordable, high-quality, modern editions, using the original text and artwork. Contents include: The Beginnings of Cryptography; From the Middle Ages Onwards; Signals, Signs, and Secret Languages; Commercial Codes; Military Codes and Ciphers; Types of Codes and Ciphers; Methods of Deciphering; Bibliography.
Artificial Intelligence in Medicine: Technical Basis and Clinical Applications presents a comprehensive overview of the field, ranging from its history and technical foundations, to specific clinical applications, and finally to future prospects. Artificial intelligence (AI) is expanding across all domains at a breakneck speed. Medicine, with the availability of large multidimensional datasets, lends itself to strong potential advancement with the appropriate harnessing of AI. The integration of AI can occur throughout the continuum of medicine: from basic laboratory discovery to clinical application and healthcare delivery. Integrating AI within medicine has been met with both excitement and scepticism. By understanding how AI works, and by developing an appreciation for both its limitations and strengths, clinicians can harness its computational power to streamline workflow and improve patient care. It also provides the opportunity to improve upon research methodologies beyond what is currently available using traditional statistical approaches. On the other hand, computer scientists and data analysts can provide solutions, but often lack easy access to the clinical insight that may help focus their efforts. This book provides vital background knowledge to help bring these two groups together and to engage in more streamlined dialogue, yielding productive collaborative solutions in the field of medicine.