Welcome to Loot.co.za!
The aim of this book is to present a range of analytical methods that can be used in formulation design and development, and to show how these methods can be applied to understand formulation components and the dosage forms they build. To effectively design and exploit drug delivery systems, the underlying characteristics of a dosage form must be understood--from the characteristics of the individual formulation components, to how they act and interact within the formulation, and finally, to how the formulation responds in different biological environments. To achieve this, a wide range of analytical techniques can be adopted to understand and elucidate the mechanics of drug delivery and drug formulation. Such methods include spectroscopic analysis, diffractometric analysis, thermal investigations, surface analytical techniques, particle size analysis, rheological techniques, methods to characterize drug stability and release, and biological analysis in appropriate cell and animal models. While each of these methods can encompass a full research area in its own right, formulation scientists must be able to apply them effectively to the delivery system under consideration. The information in this book is designed to support researchers in fully characterizing and analyzing a range of delivery systems using an appropriate selection of analytical techniques. Because it also considers regulatory approval, this book will be suitable for industrial researchers from early-stage through pre-clinical research.
This book reviews existing sensor technologies that are now being coupled with computational intelligence for the remote monitoring of physical activity and ex vivo biosignatures. In today's frenetic world, consumers are becoming ever more demanding: they want to control every aspect of their lives and look for options specifically tailored to their individual needs. In many cases, suppliers are catering to these new demands; as a result, clothing, food, social media, fitness and banking services are all being democratised to the individual. Healthcare provision has finally caught up to this trend and is currently being rebooted to offer personalised solutions, while simultaneously creating a more effective, scalable and cost-effective system for all. The desire for personalisation, home monitoring and treatment, and provision of care in remote locations or in emerging and impoverished nations that lack a fixed infrastructure, is leading to the realisation that mobile technology might be the best candidate for achieving these goals. A combination of several technological, healthcare and financial factors is driving this trend to create a new healthcare model that stresses preventative 'health-care' rather than 'sick-care', and a shift from volume to value. Mobile healthcare (mhealth), which could also be termed the "internet of people", refers to the integration of sensors and smartphones to gather and interpret clinical data from patients in real time. Most importantly, with an ageing population suffering multiple morbidities, mhealth could provide healthcare solutions that enhance chronically ill patients' quality of life.
This handbook brings together a variety of approaches to the uses of big data in multiple fields, primarily science, medicine, and business. This single resource features contributions from researchers around the world from a variety of fields, where they share their findings and experience. This book is intended to help spur further innovation in big data. The research is presented in a way that allows readers, regardless of their field of study, to learn from how applications have proven successful and how similar applications could be used in their own field. Contributions stem from researchers in fields such as physics, biology, energy, healthcare, and business. The contributors also discuss important topics such as fraud detection, privacy implications, legal perspectives, and ethical handling of big data.
The book presents a knowledge-discovery-based approach to building a recommender system that supports physicians in treating tinnitus patients with the highly successful method known as Tinnitus Retraining Therapy. It describes experiments on extracting novel knowledge from the historical dataset of patients treated by Dr. P. Jastreboff, in order to better understand the factors behind the therapy's effectiveness and to better personalize treatment for different patient profiles. The book responds to the growing demand for advanced data analytics in the healthcare industry, aimed at providing better care through data-driven decision-making. The potential economic benefits of applying computerized clinical decision support systems include not only improved efficiency in health care delivery (by reducing costs and improving quality of care and patient safety), but also improved standardization, objectivity and availability of treatment in places where expert knowledge of this difficult-to-treat hearing disorder is scarce. Furthermore, the described approach could be used to assess the clinical effectiveness of evidence-based interventions among the various proposed treatments for tinnitus.
This book presents a comprehensive and up-to-date treatise on a range of methodological and algorithmic issues. It also discusses implementations and case studies, identifies best design practices, and assesses data analytics business models and practices in industry, health care, administration and business. Data science and big data go hand in hand; together they constitute a rapidly growing area of research that has attracted the attention of industry and business alike. The area has opened up promising new directions of fundamental and applied research and has led to interesting applications, especially those addressing the immediate need to deal with large repositories of data and to build tangible, user-centric models of relationships in data. Data is the lifeblood of today's knowledge-driven economy. Numerous data science models are oriented towards end users, and along with the usual requirements for accuracy (present in any modeling) come requirements for the ability to process huge and varying data sets, as well as for robustness, interpretability, and simplicity (transparency). Computational intelligence, with its underlying methodologies and tools, helps address data analytics needs. The book is of interest to researchers and practitioners involved in data science, Internet engineering, computational intelligence, management, operations research, and knowledge-based systems.
This clear-sighted volume introduces the concept of "disruptive cooperation": transformative partnerships between the health and technology sectors to eliminate widespread healthcare problems such as inequities, waste, and inappropriate care. Emphasizing the most pressing issues of a world growing older with long-term chronic illness, it unveils a new framework for personalized, integrative service based in mobile technologies. Coverage analyzes social aspects of illness and health, clinically robust uses of health data, and wireless and wearable applications in intervention, prevention, and health promotion. Case studies from digital health innovators illustrate opportunities for coordinating the service delivery, business, research/science, and policy sectors to promote healthier aging worldwide. Topics include: cooperation in aging services technologies; the quantified self, wearables, and the tracking revolution; smart healthy cities and public-private partnerships; moving beyond silos to data analytics for population health; cooperation in building secure standards for health data; and peer-to-peer platforms for physicians in underserved areas, a human rights approach to social media in medicine. Disruptive Cooperation in Digital Health will energize digital health and healthcare professionals in both non-profit and for-profit settings. Policymakers and public health professionals with an interest in innovation policy should find it an inspiring idea book.
This textbook on practical data analytics unites fundamental principles, algorithms, and data. Algorithms are the keystone of data analytics and the focal point of this textbook. Clear and intuitive explanations of the mathematical and statistical foundations make the algorithms transparent. But practical data analytics requires more than just the foundations. Problems and data are enormously variable, and only the most elementary of algorithms can be used without modification. Programming fluency and experience with real and challenging data are indispensable, so the reader is immersed in Python and R and real data analysis. By the end of the book, the reader will have gained the ability to adapt algorithms to new problems and carry out innovative analyses. This book has three parts: (a) Data Reduction: begins with the concepts of data reduction, data maps, and information extraction. The second chapter introduces associative statistics, the mathematical foundation of scalable algorithms and distributed computing. Practical aspects of distributed computing are the subject of the Hadoop and MapReduce chapter. (b) Extracting Information from Data: linear regression and data visualization are the principal topics of Part II. The authors dedicate a chapter to the critical domain of healthcare analytics as an extended example of practical data analytics. The algorithms and analytics will be of much interest to practitioners working with the large and unwieldy data sets of the Centers for Disease Control and Prevention's Behavioral Risk Factor Surveillance System. (c) Predictive Analytics: two foundational and widely used algorithms, k-nearest neighbors and naive Bayes, are developed in detail. A chapter is dedicated to forecasting. The last chapter focuses on streaming data and uses publicly accessible data streams originating from the Twitter API and the NASDAQ stock market in the tutorials.
This book is intended for a one- or two-semester course in data analytics for upper-division undergraduate and graduate students in mathematics, statistics, and computer science. The prerequisites are kept low, and students with one or two courses in probability or statistics, an exposure to vectors and matrices, and a programming course will have no difficulty. The core material of every chapter is accessible to all with these prerequisites. The chapters often expand at the close with innovations of interest to practitioners of data science. Each chapter includes exercises of varying levels of difficulty. The text is eminently suitable for self-study and an exceptional resource for practitioners.
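The blurb above names k-nearest neighbors as one of the book's foundational predictive algorithms. As a rough illustration of the idea (a minimal sketch, not drawn from the book itself; the function name and toy data are hypothetical), a k-NN classifier assigns a point the majority label among its k closest training points:

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, x, k=3):
    # Sort training points by Euclidean distance from the query point x.
    dists = sorted(
        (math.dist(p, x), label) for p, label in zip(train_X, train_y)
    )
    # Majority vote among the labels of the k nearest points.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy usage: two clusters on a line.
X = [(0.0,), (0.2,), (1.0,), (1.2,)]
y = ["a", "a", "b", "b"]
print(knn_predict(X, y, (0.1,)))  # -> "a"
```

The same nearest-neighbor machinery scales from this toy to the CDC survey data the book uses; only the distance computation and data loading grow.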
This comprehensive book focuses on better big-data security for healthcare organizations. Following an extensive introduction to the Internet of Things (IoT) in healthcare including challenging topics and scenarios, it offers an in-depth analysis of medical body area networks with the 5th generation of IoT communication technology along with its nanotechnology. It also describes a novel strategic framework and computationally intelligent model to measure possible security vulnerabilities in the context of e-health. Moreover, the book addresses healthcare systems that handle large volumes of data driven by patients' records and health/personal information, including big-data-based knowledge management systems to support clinical decisions. Several of the issues faced in storing/processing big data are presented along with the available tools, technologies and algorithms to deal with those problems as well as a case study in healthcare analytics. Addressing trust, privacy, and security issues as well as the IoT and big-data challenges, the book highlights the advances in the field to guide engineers developing different IoT devices and evaluating the performance of different IoT techniques. Additionally, it explores the impact of such technologies on public, private, community, and hybrid scenarios in healthcare. This book offers professionals, scientists and engineers the latest technologies, techniques, and strategies for IoT and big data.
Aimed at research scientists and biotechnologists, this book is essential reading for those working with extremophiles and their potential biotechnological applications. It provides a comprehensive and reliable source of information on recent advances and challenges across different aspects of the theme. Written in accessible language, the book is also recommended as a reference text for anyone interested in this thriving field of research. Over the last few decades, the study of extremophiles has produced groundbreaking discoveries that challenge our understanding of biochemistry and molecular biology. On the applied side, extremophiles and their enzymes have spawned a multibillion-dollar biotechnology industry, with applications spanning the biomedical, pharmaceutical, industrial, environmental, and agricultural sectors. Taq DNA polymerase (isolated from Thermus aquaticus in a geothermal spring in Yellowstone National Park) is the best-known example of the biotechnological potential of extremophiles and their biomolecules. Indeed, the application of extremophiles and their biologically active compounds has opened a new era in biotechnology. However, despite the latest advances, we are only beginning to explore the biotechnological potential of extremophiles.
This book is open access under a CC BY-NC 2.5 license. This book presents the VISCERAL project benchmarks for analysis and retrieval of 3D medical images (CT and MRI) on a large scale, which used an innovative cloud-based evaluation approach where the image data were stored centrally on a cloud infrastructure and participants placed their programs in virtual machines on the cloud. The book presents the points of view of both the organizers of the VISCERAL benchmarks and the participants. The book is divided into five parts. Part I presents the cloud-based benchmarking and Evaluation-as-a-Service paradigm that the VISCERAL benchmarks used. Part II focuses on the datasets of medical images annotated with ground truth created in VISCERAL that continue to be available for research. It also covers the practical aspects of obtaining permission to use medical data and manually annotating 3D medical images efficiently and effectively. The VISCERAL benchmarks are described in Part III, including a presentation and analysis of metrics used in evaluation of medical image analysis and search. Lastly, Parts IV and V present reports by some of the participants in the VISCERAL benchmarks, with Part IV devoted to the anatomy benchmarks and Part V to the retrieval benchmark. This book has two main audiences: the datasets as well as the segmentation and retrieval results are of most interest to medical imaging researchers, while eScience and computational science experts benefit from the insights into using the Evaluation-as-a-Service paradigm for evaluation and benchmarking on huge amounts of data.
This book provides an in-depth review of state-of-the-art orthopaedic techniques and basic mechanical operations (drilling, boring, cutting, grinding/milling) involved in present day orthopaedic surgery. Casting a light on exploratory hybrid operations, as well as non-conventional techniques such as laser assisted operations, this book further extends the discussion to include physical aspects of the surgery in view of material (bone) and process parameters. Featuring detailed discussion of the computational modeling of forces (mechanical and thermal) involved in surgical procedures for the planning and optimization of the process/procedure and system development, this book lays the foundations for efforts towards the future development of improved orthopaedic surgery. With topics including the role of bone machining during surgical operations; the physical properties of the bone which influence the response to any machining operation, and robotic automation, this book will be a valuable and comprehensive literature source for years to come.
This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creating predictive or classification models - predicting nature or classifying individuals, and statistics is often used to prove or disprove phenomena as opposed to aiding in the design of a product or process. In industry however, Chemical Engineers use designed experiments to optimize petroleum extraction; Manufacturing Engineers use experimental data to optimize machine operation; Industrial Engineers might use data to determine the optimal number of operators required in a manual assembly process. This text teaches engineering and applied science students to incorporate empirical investigation into such design processes. Much of the discussion in this book is about models, not whether the models truly represent reality but whether they adequately represent reality with respect to the problems at hand; many ideas focus on how to gather data in the most efficient way possible to construct adequate models. 
Includes chapters on subjects not often seen together in a single text (e.g., measurement systems, mixture experiments, logistic regression, Taguchi methods, simulation). The techniques and concepts introduced cover a wide variety of design situations familiar to engineers and applied scientists, and encourage the incorporation of experimentation and empirical investigation into the design process. Software is integrally linked to the statistical analyses, with fully worked examples in each chapter using several packages: SAS, R, JMP, Minitab, and MS Excel; discussion questions appear at the end of each chapter. The fundamental learning objective of this textbook is for the reader to understand how experimental data can be used to make design decisions and to become familiar with the most common types of experimental designs and analysis methods.
Looking beyond the communications technology horizon and projecting future competency-specific employment demand, this book presents an evaluation of desirable information systems enhancements by integrating two disparate-domain computer ontologies. It provides readers a fresh solutions approach based on dynamic modeling and methodological contributions to philosophical and assistive communications system development in healthcare, addressing the need for both demand intelligence and practical work environment support. The pace of change in redefining occupation-specific employee resourcing needs is unrelenting and continues to accelerate. And the exponential growth in the demand for healthcare service delivery is correspondingly daunting. As such, the public and private sectors are faced with the challenge of sustaining credible relevant demand intelligence and recruitment practices, while integration, expansion and enrichment of ostensibly unconnected ontologies represent key R&D issues.
This book focuses on the development and use of interoperability standards related to healthcare information technology (HIT) and provides in-depth discussion of the associated essential aspects. The book explains the principles of conformance, examining how to improve the content of healthcare data exchange standards (including HL7 v2.x, V3/CDA, FHIR, CTS2, DICOM, EDIFACT, and ebXML), the rigor of conformance testing, and the interoperability capabilities of healthcare applications for the benefit of healthcare professionals who use HIT, developers of HIT applications, and healthcare consumers who aspire to be recipients of safe and effective health services facilitated through meaningful use of well-designed HIT. Readers will understand the common terms interoperability, conformance, compliance and compatibility, and be prepared to design and implement their own complex interoperable healthcare information system. Chapters address the practical aspects of the subject matter to enable application of previously theoretical concepts. The book provides real-world, concrete examples to explain how to apply the information, and includes many diagrams to illustrate relationships of entities and concepts described in the text. Designed for professionals and practitioners, this book is appropriate for implementers and developers of HIT, technical staff of information technology vendors participating in the development of standards and profiling initiatives, informatics professionals who design conformance testing tools, staff of information technology departments in healthcare institutions, and experts involved in standards development. Healthcare providers and leadership of provider organizations seeking a better understanding of conformance, interoperability, and IT certification processes will benefit from this book, as will students studying healthcare information technology.
This book provides a complete overview of the significant design challenges relating to circuit miniaturization and power reduction in neural recording systems, along with circuit topologies, architecture trends, and (post-silicon) circuit optimization algorithms. The novel circuits introduced for signal conditioning, quantization, and classification, as well as the system configurations, focus on optimized power-per-area performance, from the spatial resolution (i.e. number of channels), feasible wireless data bandwidth, and information quality to the power delivered by the implantable system.
This publication is sponsored by the American Association for Medical Systems and Informatics. The Board of AAMSI and the Board of the Society for Computer Medicine, one of AAMSI's predecessors, agreed that a book on the application of medical systems and informatics for the practitioner would help promote high-quality health care, and they charged the Committee on Standards of the Society for Computer Medicine with writing such a text. It is intended as a guide to the field of medical systems and informatics, with emphasis on standards, terminology, and coding systems. The text, the result of three years of research and effort, has been reviewed by the Board of Directors of AAMSI and approved by the Publications Committee. We believe that you will find it valuable and hope to revise it from time to time to meet current needs. On behalf of the members of the Association, we congratulate the authors and thank them for their efforts. WILLIAM A. BAUMAN, M.D. President, American Association for Medical Systems and Informatics. Preface: This book has been written by the members of the Committee on Standards of the Society for Computer Medicine. We have drawn upon the Society's expertise to prepare an easy-to-read and understandable how-to-do-it text for use by those physicians who are considering computerization of their office in one manner or another.
This book highlights the interdisciplinary study of cognition, mind and behavior from an information processing perspective, and describes related applications to health informatics. The respective chapters address health problem-solving and education, decision support systems, user-centered interfaces, and the design and use of controlled medical terminologies. Reflecting cutting-edge research on computational methods - including theory, algorithms, numerical simulation, error and uncertainty analysis, and their applications - the book offers a valuable resource for doctoral students and researchers in the fields of Computer Science and Engineering.
A Clinical Information System for Oncology describes a medical information system designed and implemented in a cancer center but with broad applicability to medical practice beyond the cancer center environment in both inpatient and outpatient settings. Regarded as forward looking in 1978, the system has the distinction of still being in production. Indeed, its functionality has continued to grow and its technical implementation to evolve with the changing technology over the last decade. The authors detail the functions supported by this unique system, illustrate how it assists in the care process, review its development history, and evaluate its impact on the delivery of care in terms of cost, user satisfaction, and efficacy. Unlike much information technology, the system is an active participant in medical decision making: it includes comprehensive tools for managing and displaying clinical data; automatically produces care plans from protocols; and features unique tools which support the effective use of blood products. Professionals in medical informatics, hospital administrators, and physicians will find this book a valuable addition to their professional library.
This two-volume set, LNBI 10813 and LNBI 10814, constitutes the proceedings of the 6th International Work-Conference on Bioinformatics and Biomedical Engineering, IWBBIO 2018, held in Granada, Spain, in April 2018. The 88 regular papers presented were carefully reviewed and selected from 273 submissions. The scope of the conference spans the following areas: bioinformatics for healthcare and diseases; bioinformatics tools to integrate omics datasets and address biological questions; challenges and advances in measurement and self-parametrization of complex biological systems; computational genomics; computational proteomics; computational systems for modelling biological processes; drug delivery system design aided by mathematical modelling and experiments; generation, management and biological insights from big data; high-throughput bioinformatic tools for medical genomics; next generation sequencing and sequence analysis; interpretable models in biomedicine and bioinformatics; little-big data: reducing the complexity and facing the uncertainty of highly underdetermined phenotype prediction problems; biomedical engineering; biomedical image analysis; biomedical signal analysis; challenges in smart and wearable sensor design for mobile health; and healthcare and diseases.
This work represents an inventive attempt to apply recent advances in nanotechnology to identify and characterise novel polymer systems for drug delivery through the skin. Atomic force microscopy (AFM) measurements of the nanoscale mechanical properties of topical, drug-containing polymeric films enabled the author to identify optimal compositions, in terms of flexibility and substantivity, for application to the skin. To elucidate the enhanced drug release from polyacrylate films incorporating medium chain triglycerides, the author combined AFM studies with the complementary technique of Raman micro-spectroscopy. This experimental strategy revealed that the significant increase in the drug released from these films is the result of a nanoscale two-phase structure. Finally, in experiments examining the microporation of skin using femtosecond laser ablation, the author demonstrated that the threshold at which the skin's barrier function is undermined can be dramatically reduced by the pre-application of ink. The approach allows thermal damage at the pore edge to be minimised, suggesting a very real potential for substantially increasing drug delivery in a minimally invasive fashion.
This book equips students with a thorough understanding of various types of sensors and biosensors that can be used for chemical, biological, and biomedical applications, including but not limited to temperature sensors, strain sensors, light sensors, spectrophotometric sensors, pulse oximeters, optical fiber probes, fluorescence sensors, pH sensors, ion-selective electrodes, piezoelectric sensors, glucose sensors, DNA and immunosensors, lab-on-a-chip biosensors, paper-based lab-on-a-chip biosensors, and microcontroller-based sensors. The author treats the study of biosensors with an applications-based approach, including over 15 extensive, hands-on labs given at the end of each chapter. The material is presented using a building-block approach, beginning with the fundamentals of sensor design and temperature sensors, and ending with more complicated biosensors. New to this second edition are sections on op-amp filters, pulse oximetry, meat quality monitoring, advanced fluorescent dyes, autofluorescence, various fluorescence detection methods, the fluoride ion-selective electrode, advanced glucose sensing methods including continuous glucose monitoring, paper-based lab-on-a-chip, and more. A new chapter on nano-biosensors and an appendix on microcontrollers make this textbook ideal for undergraduate engineering students studying biosensors. It can also serve as a hands-on guide for scientists and engineers working in the sensor or biosensor industries.
This book constitutes the refereed proceedings of the 6th International Conference on Health Information Science, HIS 2017, held in Moscow, Russia, in October 2017. The 11 full papers and 7 short papers presented were carefully reviewed and selected from 44 submissions. The papers feature multidisciplinary research results in health information science and systems that support health information management and health service delivery. They relate to all aspects of the conference scope, including: medical/health/biomedicine information resources, such as patient medical records, devices and equipment, and software and tools to capture, store, retrieve, process, analyze, and optimize the use of information in the health domain; data management, data mining, and knowledge discovery; management of public health, examination of standards, and privacy and security issues; computer visualization and artificial intelligence for computer-aided diagnosis; and development of new architectures and applications for health information systems.
This book provides a review of essential research on urinary tract infections (UTIs), as well as a broader perspective on methodologies adopted for the isolation and identification of the bacteria from urine samples of pregnant and non-pregnant women on the basis of their cultural, morphological and biochemical characteristics. The identification is extended to the strain level by means of molecular identification involving BLAST as a bioinformatics tool. The book also addresses the roles of various other bioinformatics tools for tracing the phylogenetic tree and conservation studies among the bacteriocin of the identified bacteria. Lastly, it assesses the antibiotics resistance patterns of these isolates.
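The blurb above mentions BLAST-based molecular identification of bacterial isolates. At its core, such identification ranks reference sequences by how well they match an unknown sequence; as a toy illustration (a minimal sketch, not from the book; the reference sequences and species labels below are fabricated placeholders, and real BLAST uses local alignment rather than position-by-position identity), the ranking idea can be shown with a simple percent-identity score over pre-aligned, equal-length fragments:

```python
def percent_identity(a, b):
    # Fraction of matching positions between two equal-length aligned sequences.
    assert len(a) == len(b)
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return 100.0 * matches / len(a)

# Hypothetical reference fragments (placeholders, not real 16S rRNA data).
references = {
    "Escherichia coli": "ACGTACGTAC",
    "Staphylococcus aureus": "ACGTTTGTAC",
}
unknown = "ACGTACGAAC"

# Identify the isolate as the reference with the highest identity score.
best = max(references, key=lambda name: percent_identity(references[name], unknown))
print(best)  # -> Escherichia coli
```

In practice the heavy lifting (seeding, local alignment, and statistical significance of hits) is what BLAST adds on top of this ranking idea.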
This book constitutes the thoroughly refereed post-conference proceedings of the International Conference for Smart Health, ICSH 2017, held in Hong Kong, China, in June 2017. The 18 full papers and 13 short papers presented were carefully reviewed and selected from 38 submissions. They focus on studies on the principles, approaches, models, frameworks, new applications, and effects of using novel information technology to address healthcare problems and improve social welfare.
This book constitutes the refereed joint proceedings of the First International Workshop on Graphs in Biomedical Image Analysis, GRAIL 2017, the 6th International Workshop on Mathematical Foundations of Computational Anatomy, MFCA 2017, and the Third International Workshop on Imaging Genetics, MICGen 2017, held in conjunction with the 20th International Conference on Medical Imaging and Computer-Assisted Intervention, MICCAI 2017, in Quebec City, QC, Canada, in September 2017. The 7 full papers presented at GRAIL 2017, the 10 full papers presented at MFCA 2017, and the 5 full papers presented at MICGen 2017 were carefully reviewed and selected. The GRAIL papers cover a wide range of graph-based medical image analysis methods and applications, including probabilistic graphical models, neuroimaging using graph representations, machine learning for diagnosis prediction, and shape modeling. The MFCA papers deal with theoretical developments in non-linear image and surface registration in the context of computational anatomy. The MICGen papers cover topics in the field of medical genetics, computational biology and medical imaging.
You may like...
Medical Devices - Use and Safety
Bertil Jacobson, Alan Murray
Paperback
R1,006
Discovery Miles 10 060
Radiomics and Its Clinical Application…
Jie Tian, Di Dong, …
Paperback
R2,536
Discovery Miles 25 360
Intelligent Materials for Controlled…
Steven M Dinh, John DeNuzzio, …
Hardcover
R2,327
Discovery Miles 23 270
Multi-Criteria Decision-Making Sorting…
Luis Martinez Lopez, Alessio Ishizaka, …
Paperback
R2,948
Discovery Miles 29 480