This book describes issues in modeling unconventional conflict and suggests a new way to approach the modeling. It presents an ontology that describes the unconventional conflict domain, which allows for greater ease in modeling unconventional conflict. Supporting holistic modeling, which means that we can see the entire picture of what needs to be modeled, the ontology allows us to make informed decisions about what to model and what to omit. The unconventional conflict ontology also separates the things we understand best from the things we understand least. This separation means that we can perform verification, validation and accreditation (VV&A) more efficiently and can describe the competence of the model more accurately. However, before this message can be presented in its entirety, the supporting body of knowledge has to be explored. For this reason, the book offers chapters on the description of unconventional conflict and the analyses that have been performed, on modeling (with a concentration on past efforts at modeling unconventional conflict), on the precursors to the ontology, and on VV&A. Unconventional conflict is a complex, messy thing: it normally involves multiple actors, each with their own conflicting agendas and differing concepts of legitimate action. The book provides a useful introduction for researchers and professionals in the field.
The use of simulation modeling and analysis is becoming increasingly popular as a technique for improving or investigating process performance. This book is a practical, easy-to-follow reference that offers up-to-date information and step-by-step procedures for conducting simulation studies. It provides sample simulation project support material, including checklists, data-collection forms, and sample simulation project reports and publications, to facilitate practitioners' efforts in conducting simulation modeling and analysis projects. Simulation Modeling Handbook: A Practical Approach has two major advantages over other treatments. First, it is independent of any particular simulation software, allowing readers to use any commercial package or programming language. Second, it was written to insulate practitioners from unnecessary simulation theory that does not address their practical needs. As the popularity of simulation studies continues to grow, more and more engineering and management professionals will be called upon to plan and execute these projects. With its simple, no-nonsense approach and focus on application rather than theory, this comprehensive and easy-to-understand guide is the ideal vehicle for acquiring the background and skills needed to undertake effective simulation projects. Features: presents step-by-step procedures for conducting successful simulation modeling and analysis; addresses every phase of performing simulations, from formulating the problem to presenting study results and recommendations; uses approaches applicable regardless of the specific simulation software used; and includes a summary of the major simulation software packages and discusses the pros and cons of using general-purpose programming languages.
This book offers a comprehensive overview of cutting-edge approaches for decision-making in hierarchical organizations. It presents soft-computing-based techniques, including fuzzy sets, neural networks, genetic algorithms and particle swarm optimization, and shows how these approaches can be effectively used to deal with problems typical of this kind of organization. After introducing the main classical approaches applied to multiple-level programming, the book describes a set of soft-computing techniques, demonstrating their advantages in providing more efficient solutions to hierarchical decision-making problems compared to the classical methods. Based on the book Fuzzy and Multi-Level Decision Making (Springer, 2001) by E.S. Lee and H. Shih, this second edition has been expanded to include the most recent findings and methods and a broader spectrum of soft computing approaches. All the algorithms are presented in detail, together with a wealth of practical examples and solutions to real-world problems, providing students, researchers and professionals with a timely, practice-oriented reference guide to the area of interactive fuzzy decision making, multi-level programming and hierarchical optimization.
The understanding and control of transport phenomena in materials processing play an important role in the improvement of conventional processes and in the development of new techniques. Computer modeling of these phenomena can be used effectively for this purpose. Although there are several books in the literature covering the analysis of heat transfer and fluid flow, Computer Modelling of Heat and Fluid Flow in Materials Processing specifically addresses the understanding of these phenomena in materials processing situations. Written at a level suitable for graduate students in materials science and engineering and related subjects, this book is ideal for those wishing to learn how to approach computer modeling of transport phenomena and apply these techniques in materials processing. The text includes a number of relevant case studies, and each chapter is supported by numerous examples of transport modeling programs.
'Points, questions, stories, and occasional rants introduce the 24 chapters of this engaging volume. With a focus on mathematics and peppered with a scattering of computer science settings, the entries range from lightly humorous to curiously thought-provoking. Each chapter includes sections and sub-sections that illustrate and supplement the point at hand. Most topics are self-contained within each chapter, and a solid high school mathematics background is all that is needed to enjoy the discussions. There certainly is much to enjoy here.' - CHOICE. Ever notice how people sometimes use math words inaccurately? Or how sometimes you instinctively know a math statement is false (or not known)? Each chapter of this book makes a point like those above and then illustrates the point by doing some real mathematics through step-by-step mathematical techniques. This book gives readers valuable information about how mathematics and theoretical computer science work, while teaching them some actual mathematics and computer science through examples and exercises. Much of the mathematics could be understood by a bright high school student. The points made can be understood by anyone with an interest in math, from the bright high school student to a Fields Medal winner.
This book explores four guiding themes - reduced order modelling, high dimensional problems, efficient algorithms, and applications - by reviewing recent algorithmic and mathematical advances and the development of new research directions for uncertainty quantification in the context of partial differential equations with random inputs. Highlighting the most promising approaches for (near-) future improvements in the way uncertainty quantification problems in the partial differential equation setting are solved, and gathering contributions by leading international experts, the book's content will impact the scientific, engineering, financial, economic, environmental, social, and commercial sectors.
Lankhorst and his co-authors present ArchiMate(R) 3.0, an enterprise modelling language that captures the complexity of architectural domains and their relations and allows the construction of integrated enterprise architecture models. They provide architects with concrete instruments that improve their architectural practice. As this alone is not enough, they additionally present techniques and heuristics for communicating with all relevant stakeholders about these architectures. Since an architecture model is useful not only for providing insight into the current or future situation but also for evaluating the transition from 'as-is' to 'to-be', the authors also describe analysis methods for assessing both the qualitative impact of changes to an architecture and the quantitative aspects of architectures, such as performance and cost issues. The modelling language presented has been proven in practice in many real-life case studies and has been adopted by The Open Group as an international standard. This book is thus an ideal companion for enterprise IT or business architects in industry as well as for computer or management science students studying the field of enterprise architecture. This fourth edition of the book has been completely reworked to be compatible with ArchiMate(R) 3.0, and it includes a new chapter relating this new version to other standards. New sections on capability analysis, risk analysis, and business architecture in general have also been introduced.
Much has been written about Building Information Modelling (BIM) driving collaboration and innovation, but how will future quality managers and engineers develop digital capabilities in augmented and virtual realities, with business intelligence platforms, robots, new materials, artificial intelligence, blockchains, drones, laser scanning, data trusts, 3D printing and many other types of technological advances in construction? These emerging technologies are potential game changers that require new skills and processes. Digital Quality Management in Construction is the first 'how to' book on harnessing novel disruptive technology in construction quality management. The book takes a tour of the new technologies and relates them to the management of quality, but also sets out a road map to build on proven lean construction techniques and embed technologically based processes to raise quality professionals' digital capabilities. With the mountain of data being generated, quality managers need to unlock its value to drive the quality of construction in the twenty-first century, and this book will help them do that, allowing those working in construction quality management to survive and thrive, creating higher quality levels and less waste. This book is essential reading for quality managers, project managers and all professionals in the Architecture, Engineering and Construction (AEC) industry. Students interested in new and disruptive technologies will also learn a great deal from reading this book, written by a professional quality manager with nearly thirty years' experience in both the public and private sectors.
The Pacific Symposium on Biocomputing (PSB) 2019 is an international, multidisciplinary conference for the presentation and discussion of current research in the theory and application of computational methods in problems of biological significance. Presentations are rigorously peer reviewed and are published in an archival proceedings volume. PSB 2019 will be held on January 3-7, 2019, on the Kohala Coast, Hawaii. Tutorials and workshops will be offered prior to the start of the conference. PSB 2019 will bring together top researchers from the US, the Asian Pacific nations, and around the world to exchange research results and address open issues in all aspects of computational biology. It is a forum for the presentation of work in databases, algorithms, interfaces, visualization, modeling, and other computational methods, as applied to biological problems, with emphasis on applications in data-rich areas of molecular biology. The PSB has been designed to be responsive to the need for critical mass in sub-disciplines within biocomputing. For that reason, it is the only meeting whose sessions are defined dynamically each year in response to specific proposals. PSB sessions are organized by leaders of research in biocomputing's 'hot topics.' In this way, the meeting provides an early forum for serious examination of emerging methods and approaches in this rapidly changing field.
Since the early 1980s, CAD frameworks have received a great deal of attention, both in the research community and in the commercial arena. It is generally agreed that CAD framework technology promises much: advanced CAD frameworks can turn collections of individual tools into effective and user-friendly design environments. But how can this promise be fulfilled? CAD Frameworks: Principles and Architecture describes the design and construction of CAD frameworks. It presents principles for building integrated design environments and shows how a CAD framework can be based on these principles. It derives the architecture of a CAD framework in a systematic way, using well-defined primitives for representation. This architecture defines how the many different framework sub-topics, ranging from concurrency control to design flow management, relate to each other and come together into an overall system. The origin of this work is the research and development performed in the context of the Nelsis CAD Framework, which has been a working system for well over eight years, gaining functionality while evolving from one release to the next. The principles and concepts presented in this book have been field-tested in the Nelsis CAD Framework. CAD Frameworks: Principles and Architecture is primarily intended for EDA professionals, both in industry and in academia, but is also valuable outside the domain of electronic design. Many of the principles and concepts presented are also applicable to other design-oriented application domains, such as mechanical design or computer-aided software engineering (CASE). It is thus a valuable reference for all those involved in computer-aided design.
In recent years the concept of energy has been revised, and a new model based on the principle of sustainability has become more and more pervasive. The appraisal of energy technologies and projects is complex and uncertain, as the related decision making has to encompass environmental, technical, economic and social factors and information sources. The scientific procedure of assessment has a vital role, as it can supply the right tools to evaluate the actual situation and make realistic forecasts of the effects and outcomes of any actions undertaken. "Assessment and Simulation Tools for Sustainable Energy Systems" offers reviews of the main assessment and simulation methods used for effective energy assessment. Divided across three sections, "Assessment and Simulation Tools for Sustainable Energy Systems" develops the reader's ability to select suitable tools to support decision making and implementation of sustainable energy projects. The first section is dedicated to the analysis of theoretical foundations and applications of multi-criteria decision making. This is followed by chapters concentrating on the theory and practice of fuzzy inference, neural nets and genetic algorithms. Finally, simulation methods such as Monte Carlo analysis, mathematical programming and others are detailed. This comprehensive illustration of these tools and their application makes "Assessment and Simulation Tools for Sustainable Energy Systems" a key guide for researchers, scientists, managers, politicians and industry professionals developing the field of sustainable energy systems. It may also prompt further advancements in soft computing and simulation issues for students and researchers.
Prevention of Pressure Sores: Engineering and Clinical Aspects collects together material from throughout the literature. The book first discusses the causes of pressure sores and then describes warning signs and behavior to prevent the incidence of pressure sores. It also examines the numerous different devices used to alleviate and prevent pressure sores, including various types of seat cushions, hospital beds, complex pressure relief methods, wheelchair pressure reliefs, and other preventative methods. After comparing the accuracy of various methods of measuring pressure distributions using different types of sensors, the book discusses the treatment of pressure sores. It contains a large number of references, allowing readers to refer back to the important original work in the different fields of this subject.
For all introductory genetics courses. Focus on essential genetic topics and explore the latest breakthroughs. Known for its focus on conceptual understanding, problem solving, and practical applications, the bestselling Essentials of Genetics strengthens problem-solving skills and explores the essential genetics topics that today's students need to understand. The 10th Edition has been extensively updated to provide comprehensive coverage of important, emerging topics such as CRISPR-Cas, epigenetics, and genetic testing. Additionally, a new Special Topic chapter covers Advances in Neurogenetics with a focus on Huntington Disease, and new essays on Genetics, Ethics, and Society emphasize ethical considerations that genetics is bringing into everyday life. The accompanying Mastering Genetics online platform includes new tutorials on topics such as CRISPR-Cas and epigenetics, and new Dynamic Study Modules, which support student learning of key concepts and prepare them for class. Also available as a Pearson eText or packaged with Mastering Genetics: Pearson eText is a simple-to-use, mobile-optimized, personalized reading experience that can be adopted on its own as the main course material. It lets students highlight, take notes, and review key vocabulary all in one place, even when offline. Seamlessly integrated videos and other rich media engage students and give them access to the help they need, when they need it. Educators can easily share their own notes with students so they see the connection between their eText and what they learn in class - motivating them to keep reading, and keep learning. If your instructor has assigned Pearson eText as your main course material, search for: 0135588847 / 9780135588840 Pearson eText Essentials of Genetics -- Access Card, 10/e OR 0135588782 / 9780135588789 Pearson eText Essentials of Genetics -- Instant Access, 10/e Also available with Mastering Genetics By combining trusted author content with digital tools and a flexible platform, Mastering personalizes the learning experience and improves results for each student. Mastering Genetics allows students to develop problem-solving skills, learn from tutorials on key genetics concepts, and gain a better understanding of emerging topics. If you would like to purchase both the physical text and Mastering Genetics, search for: 0135173604 / 9780135173602 Essentials of Genetics Plus Mastering Genetics -- Access Card Package Package consists of: 0134898419 / 9780134898414 Essentials of Genetics 0135188687 / 9780135188682 Mastering Genetics with Pearson eText -- ValuePack Access Card -- for Essentials of Genetics Note: You are purchasing a standalone book; Pearson eText and Mastering Genetics do not come packaged with this content. Students, ask your instructor for the correct package ISBN and Course ID. Instructors, contact your Pearson representative for more information.
Concrete-filled stainless steel tubular (CFSST) columns are increasingly used in modern composite construction due to their high strength, high ductility, high corrosion resistance, high durability and aesthetics and ease of maintenance. Thin-walled CFSST columns are characterized by the different strain-hardening behavior of stainless steel in tension and in compression, local buckling of stainless steel tubes and concrete confinement. Design codes and numerical models often overestimate or underestimate the ultimate strengths of CFSST columns. This book presents accurate and efficient computational models for the nonlinear inelastic analysis and design of CFSST short and slender columns under axial load and biaxial bending. The effects of different strain-hardening characteristics of stainless steel in tension and in compression, progressive local and post-local buckling of stainless steel tubes and concrete confinement are taken into account in the computational models. The numerical models simulate the axial load-strain behavior, moment-curvature curves, axial load-deflection responses and axial load-moment strength interaction diagrams of CFSST columns. The book describes the mathematical formulations, computational procedures and model verifications for circular and rectangular CFSST short and slender columns. The behavior of CFSST columns under various loading conditions is demonstrated by numerous numerical examples. This book is written for practising structural and civil engineers, academic researchers and graduate students in civil engineering who are interested in the latest computational techniques and design methods for CFSST columns.
Any study on the historical evolution of nations and countries points out the decisive importance of productivity trends. We are all very familiar with the main evolution which started with a hunting society at the dawn of civilization, then moved to an agricultural society, and quickly to craftsmanship and commerce. The beginning of the industrial society dates back to the end of the eighteenth century in England, with the introduction of the assembly line in the textile and smelting industries. However, in the last few decades, we are becoming more and more acutely aware of the paramount importance of the production of "information". Indeed, according to a few economists today, we should be classified as living in an information society which has superseded the industrial society. At this point it simply becomes necessary to talk about the computer information industry, which is more and more pervading our lives, from the personal computer, to the workstation, to information networks and electronic mail, to the blueprint executed by robots, to the supercomputer necessary in any major scientific and engineering task. The computer has already brought about a momentous change in the production line - less and less man-size, more and more robot-size. But this rush to technical innovation has not stopped at this point. Artificial intelligence and expert systems are becoming a more and more important factor for production by many enterprises and activities.
The International Conference on Design and Decision Support Systems in Architecture and Urban Planning is organised bi-annually by the Eindhoven University of Technology. This volume contains a selection of papers presented at the eighth conference that was held at the Kapellerput Conference Centre in the village of Heeze, near Eindhoven, The Netherlands, from 4 to 7 July, 2006. Traditionally, the DDSS conferences aim to be a platform for both starting and experienced researchers who focus on the development and application of computer support in the areas of urban planning and architectural design. This results in an interesting mix of well-established research projects and first explorations. It also leads to a very valuable cross-over of theories, methods, and technologies for support systems in the two different areas, architecture and urban planning. This volume contains 31 peer reviewed papers from this year's conference that are organised into seven sections: Land Use Simulation and Visualisation; Multi-Agent Models for Movement Simulation; Multi-Agent Models for Urban Development; Managing and Deploying Design Knowledge; Urban Decision-Making; Design Interactivity and Design Automation; and Virtual Environments and Augmented Reality. This book will bring researchers together and is a valuable resource for their continuous joint effort to improve the design and planning of our environment.
Data and its technologies now play a large and growing role in humanities research and teaching. This book addresses the needs of humanities scholars who seek deeper expertise in the area of data modeling and representation. The authors, all experts in digital humanities, offer a clear explanation of key technical principles, a grounded discussion of case studies, and an exploration of important theoretical concerns. The book opens with an orientation, giving the reader a history of data modeling in the humanities and a grounding in the technical concepts necessary to understand and engage with the second part of the book. The second part of the book is a wide-ranging exploration of topics central for a deeper understanding of data modeling in digital humanities. Chapters cover data modeling standards and the role they play in shaping digital humanities practice, traditional forms of modeling in the humanities and how they have been transformed by digital approaches, ontologies which seek to anchor meaning in digital humanities resources, and how data models inhabit the other analytical tools used in digital humanities research. It concludes with a glossary chapter that explains specific terms and concepts for data modeling in the digital humanities context. This book is a unique and invaluable resource for teaching and practising data modeling in a digital humanities context.
Collecting the work of the foremost scientists in the field, Discrete-Event Modeling and Simulation: Theory and Applications presents the state of the art in modeling discrete-event systems using the discrete-event system specification (DEVS) approach. It introduces the latest advances, recent extensions of formal techniques, and real-world examples of various applications. The book covers many topics that pertain to several layers of the modeling and simulation architecture. It discusses DEVS model development support and the interaction of DEVS with other methodologies. It describes different forms of simulation supported by DEVS, the use of real-time DEVS simulation, the relationship between DEVS and graph transformation, the influence of DEVS variants on simulation performance, and interoperability and composability with emphasis on DEVS standardization. The text also examines extensions to DEVS, new formalisms, and abstractions of DEVS models as well as the theory and analysis behind real-world system identification and control. To support the generation and search of optimal models of a system, a framework is developed based on the system entity structure and its transformation to DEVS simulation models. In addition, the book explores numerous interesting examples that illustrate the use of DEVS to build successful applications, including optical network-on-chip, construction/building design, process control, workflow systems, and environmental models. A one-stop resource on advances in DEVS theory, applications, and methodology, this volume offers a sampling of the best research in the area, a broad picture of the DEVS landscape, and trend-setting applications enabled by the DEVS approach. It provides the basis for future research discoveries and encourages the development of new applications.
This textbook teaches the essential background and skills for understanding and quantifying uncertainties in a computational simulation, and for predicting the behavior of a system under those uncertainties. It addresses a critical knowledge gap in the widespread adoption of simulation in high-consequence decision-making throughout the engineering and physical sciences. Constructing sophisticated techniques for prediction from basic building blocks, the book first reviews the fundamentals that underpin later topics, including probability, sampling, and Bayesian statistics. Part II focuses on applying local sensitivity analysis to apportion uncertainty in the model outputs to sources of uncertainty in its inputs. Part III demonstrates techniques for quantifying the impact of parametric uncertainties on a problem, specifically how input uncertainties affect outputs. The final section covers techniques for applying uncertainty quantification to make predictions under uncertainty, including the treatment of epistemic uncertainties. It presents the theory and practice of predicting the behavior of a system based on the aggregation of data from simulation, theory, and experiment. The text focuses on simulations based on the solution of systems of partial differential equations and includes in-depth coverage of Monte Carlo methods, basic design of computer experiments, as well as regularized statistical techniques. Code references, in Python, appear throughout the text and online as executable code, enabling readers to perform the analysis under discussion. Worked examples from realistic model problems help readers understand the mechanics of applying the methods. Each chapter ends with several assignable problems. Uncertainty Quantification and Predictive Computational Science fills the growing need for a classroom text for senior undergraduate and early-career graduate students in the engineering and physical sciences and supports independent study by researchers and professionals who must include uncertainty quantification and predictive science in the simulations they develop and/or perform.
This volume offers a valuable starting point for anyone interested in learning computational diffusion MRI and mathematical methods for brain connectivity, while also sharing new perspectives and insights on the latest research challenges for those currently working in the field. Over the last decade, interest in diffusion MRI has virtually exploded. The technique provides unique insights into the microstructure of living tissue and enables in-vivo connectivity mapping of the brain. Computational techniques are key to the continued success and development of diffusion MRI and to its widespread transfer into the clinic, while new processing methods are essential to addressing issues at each stage of the diffusion MRI pipeline: acquisition, reconstruction, modeling and model fitting, image processing, fiber tracking, connectivity mapping, visualization, group studies and inference. These papers from the 2016 MICCAI Workshop "Computational Diffusion MRI" - which was intended to provide a snapshot of the latest developments within the highly active and growing field of diffusion MR - cover a wide range of topics, from fundamental theoretical work on mathematical modeling, to the development and evaluation of robust algorithms and applications in neuroscientific studies and clinical practice. The contributions include rigorous mathematical derivations, a wealth of rich, full-color visualizations, and biologically or clinically relevant results. As such, they will be of interest to researchers and practitioners in the fields of computer science, MR physics, and applied mathematics.
This book covers the methodological, epistemological and practical issues of integrating qualitative and socio-anthropological factors into archaeological modeling. The text fills the gap between conceptual modeling, which usually relies on narratives describing the life of a past community, and formalized/computer-based modeling, which is usually environmentally determined. Methods combining both environmental and social issues through niche and agent-based modeling are presented. These methods help to translate data from paleo-environmental and archaeological society life cycles (such as climate and landscape changes) to the local spatial scale. The epistemological discussions will appeal to readers, as will the treatment of the resilience that socio-anthropological factors provide in the face of climatic fluctuations. Integrating Qualitative and Social Science Factors in Archaeological Modelling will appeal to students and researchers in the field.
The book provides a self-contained treatment of stochastic finite element methods. It helps the reader to establish a solid background on stochastic and reliability analysis of structural systems and enables practicing engineers to better manage the concepts of analysis and design in the presence of uncertainty. The book covers the basic topics of computational stochastic mechanics focusing on the stochastic analysis of structural systems in the framework of the finite element method. The target audience primarily comprises students in a postgraduate program specializing in structural engineering but the book may also be beneficial to practicing engineers and research experts alike.
This book analyses the impact computerization has had on contemporary science and explains the origins, technical nature and epistemological consequences of the current decisive interplay between technology and science: an intertwining of formalism, computation, data acquisition, data treatment and visualization, and how these factors have led to the spread of simulation models since the 1950s. Using historical, comparative and interpretative case studies from a range of disciplines, with a particular emphasis on the case of plant studies, the author shows how and why computers, data treatment devices and programming languages have occasioned a gradual but irresistible and massive shift from mathematical models to computer simulations.
This book offers an overview of some recent advances in Computational Bioacoustics methods and technology. The focus of the discussion is the pursuit of scalability, which would facilitate real-world applications of different scope and purpose, such as wildlife monitoring, biodiversity assessment, pest population control, and monitoring the spread of disease-transmitting mosquitoes. The various tasks of Computational Bioacoustics are described, and a wide range of audio parameterization and recognition tasks related to the automated recognition of species and sound events is discussed. Many of the Computational Bioacoustics methods were originally developed for the needs of speech, audio, or image processing, and afterwards were adapted to the requirements of automated acoustic recognition of species, or were elaborated further to address the challenges of real-world operation in 24/7 mode. The interested reader is encouraged to follow the numerous references and links to web resources for further information and insights. This book is addressed to software engineers, IT experts, computer science researchers, bioacousticians, and other practitioners concerned with the creation of new tools and services aimed at enhancing the technological support to Computational Bioacoustics applications. The book appears in the series STTM, Speech Technology and Text Mining in Medicine and Health Care. This series demonstrates how the latest advances in speech technology and text mining positively affect patient healthcare and, in a much broader sense, public health at large. New developments in text mining methods have allowed health care providers to monitor a large population of patients at any time and from any location. Employing advanced summarization techniques, patient data can be readily extracted from extensive clinical documents in electronic health records and immediately made available to the physician. These same summarization techniques can also aid the healthcare provider in extracting from the large corpora of medical literature the relevant information for treating the patient. The series topics include the design and acceptance of speech-enabled robots that assist in the operating room, studies of signal processing and acoustic modeling for speech and communication disorders, advanced statistical speech enhancement methods for creating synthetic voice, and technologies for addressing speech and language impairments. Titles in the series consist of both authored books and edited contributions. All authored books and contributed works are peer-reviewed. The series is for speech scientists and speech engineers, machine learning experts, biomedical engineers, medical speech pathologists, linguists, and healthcare professionals.