This book is open access under a CC BY-NC 2.5 license. This book presents the VISCERAL project benchmarks for analysis and retrieval of 3D medical images (CT and MRI) on a large scale, which used an innovative cloud-based evaluation approach where the image data were stored centrally on a cloud infrastructure and participants placed their programs in virtual machines on the cloud. The book presents the points of view of both the organizers of the VISCERAL benchmarks and the participants. The book is divided into five parts. Part I presents the cloud-based benchmarking and Evaluation-as-a-Service paradigm that the VISCERAL benchmarks used. Part II focuses on the datasets of medical images annotated with ground truth created in VISCERAL that continue to be available for research. It also covers the practical aspects of obtaining permission to use medical data and manually annotating 3D medical images efficiently and effectively. The VISCERAL benchmarks are described in Part III, including a presentation and analysis of metrics used in evaluation of medical image analysis and search. Lastly, Parts IV and V present reports by some of the participants in the VISCERAL benchmarks, with Part IV devoted to the anatomy benchmarks and Part V to the retrieval benchmark. This book has two main audiences: the datasets as well as the segmentation and retrieval results are of most interest to medical imaging researchers, while eScience and computational science experts benefit from the insights into using the Evaluation-as-a-Service paradigm for evaluation and benchmarking on huge amounts of data.
This book introduces medical imaging, its security requirements, and various security mechanisms based on data-hiding approaches. In particular, it presents medical data-hiding techniques that use advanced image transforms and encryption methods. The book focuses on two types of data-hiding techniques for medical images: steganography and watermarking. The authors show how these techniques are used for security and integrity verification of medical images and how they are designed for various image types, such as grayscale and color images. The techniques are implemented using the discrete cosine transform (DCT), discrete wavelet transform (DWT), singular value decomposition (SVD), redundant DWT (RDWT), fast discrete curvelet transform (FDCuT), finite ridgelet transform (FRT), and non-subsampled contourlet transform (NSCT). Results are demonstrated after the description of each technique. Finally, future research directions are outlined for the security of medical images in telemedicine applications.
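To make the transform-domain idea concrete, the sketch below embeds watermark bits into mid-band DCT coefficients of a grayscale image and recovers them by comparing the marked image against the original (a non-blind scheme). The function names, coefficient positions, and embedding strength `alpha` are illustrative assumptions, not the specific algorithms from the book.

```python
import numpy as np
from scipy.fft import dct, idct

def embed_watermark(image, bits, alpha=10.0):
    """Embed watermark bits into mid-band 2D-DCT coefficients of a
    grayscale image (hypothetical scheme for illustration only)."""
    coeffs = dct(dct(image, axis=0, norm="ortho"), axis=1, norm="ortho")
    # Fixed mid-band positions, one coefficient per bit (an assumption).
    positions = [(4 + i, 5 + i) for i in range(len(bits))]
    for bit, (r, c) in zip(bits, positions):
        coeffs[r, c] += alpha if bit else -alpha
    return idct(idct(coeffs, axis=1, norm="ortho"), axis=0, norm="ortho")

def extract_watermark(original, marked, n_bits):
    """Recover bits by comparing DCT coefficients (non-blind extraction)."""
    c0 = dct(dct(original, axis=0, norm="ortho"), axis=1, norm="ortho")
    c1 = dct(dct(marked, axis=0, norm="ortho"), axis=1, norm="ortho")
    positions = [(4 + i, 5 + i) for i in range(n_bits)]
    return [1 if c1[r, c] - c0[r, c] > 0 else 0 for (r, c) in positions]

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, size=(64, 64))
bits = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed_watermark(img, bits)
assert extract_watermark(img, marked, len(bits)) == bits
```

Because the orthonormal DCT is exactly invertible, the coefficient differences survive the round trip; real schemes must additionally withstand compression, noise, and attacks, which is where the robustness analysis in the book comes in.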
Looking beyond the communications technology horizon and projecting future competency-specific employment demand, this book evaluates desirable information systems enhancements by integrating two computer ontologies from disparate domains. It offers readers a fresh solution approach based on dynamic modelling, along with methodological contributions to philosophical and assistive communications system development in healthcare, addressing the need for both demand intelligence and practical work environment support. The pace of change in redefining occupation-specific employee resourcing needs is unrelenting and continues to accelerate, and the exponential growth in demand for healthcare service delivery is correspondingly daunting. The public and private sectors therefore face the challenge of sustaining credible, relevant demand intelligence and recruitment practices, while the integration, expansion, and enrichment of ostensibly unconnected ontologies remain key R&D issues.
Aimed at research scientists and biotechnologists, this book is essential reading for those working with extremophiles and their potential biotechnological applications. It provides a comprehensive and reliable source of information on recent advances and challenges across different aspects of the theme. Written in accessible language, the book is also recommended as a reference text for anyone interested in this thriving field of research. Over the last few decades, the study of extremophiles has produced groundbreaking discoveries that challenge our understanding of biochemistry and molecular biology. On the applied side, extremophiles and their enzymes have spawned a multibillion-dollar biotechnology industry, with applications spanning the biomedical, pharmaceutical, industrial, environmental, and agricultural sectors. Taq DNA polymerase, isolated from Thermus aquaticus in a geothermal spring in Yellowstone National Park, is the best-known example of the biotechnological potential of extremophiles and their biomolecules. Indeed, the application of extremophiles and their biologically active compounds has opened a new era in biotechnology. Despite the latest advances, however, we are only beginning to explore the biotechnological potential of extremophiles.
This book focuses on the development and use of interoperability standards related to healthcare information technology (HIT) and provides an in-depth discussion of the associated essential aspects. It explains the principles of conformance, examining how to improve the content of healthcare data exchange standards (including HL7 v2.x, V3/CDA, FHIR, CTS2, DICOM, EDIFACT, and ebXML), the rigor of conformance testing, and the interoperability capabilities of healthcare applications. These improvements benefit healthcare professionals who use HIT, developers of HIT applications, and healthcare consumers who aspire to receive safe and effective health services facilitated through the meaningful use of well-designed HIT. Readers will come to understand the terms interoperability, conformance, compliance, and compatibility, and will be prepared to design and implement their own complex interoperable healthcare information systems. Chapters address the practical aspects of the subject matter to enable the application of previously theoretical concepts. The book provides real-world, concrete examples of how to apply the information, and includes many diagrams to illustrate the relationships between the entities and concepts described in the text. Designed for professionals and practitioners, this book is appropriate for implementers and developers of HIT, technical staff of information technology vendors participating in the development of standards and profiling initiatives, informatics professionals who design conformance testing tools, staff of information technology departments in healthcare institutions, and experts involved in standards development. Healthcare providers and the leadership of provider organizations seeking a better understanding of conformance, interoperability, and IT certification processes will benefit from this book, as will students studying healthcare information technology.
The aim of this book is to present a range of analytical methods that can be used in formulation design and development, focusing on how these methods can be applied to understand both the formulation components and the dosage forms they build. To effectively design and exploit drug delivery systems, the underlying characteristics of a dosage form must be understood: from the characteristics of the individual formulation components, to how they act and interact within the formulation, and finally to how the formulation responds in different biological environments. To achieve this, a wide range of analytical techniques can be adopted to understand and elucidate the mechanics of drug delivery and drug formulation. Such methods include, for example, spectroscopic analysis, diffractometric analysis, thermal investigations, surface analytical techniques, particle size analysis, rheological techniques, methods to characterize drug stability and release, and biological analysis in appropriate cell and animal models. While each of these methods could encompass a full research area in its own right, formulation scientists must be able to apply them effectively to the delivery system under consideration. The information in this book is designed to support researchers in fully characterizing and analyzing a range of delivery systems using an appropriate selection of analytical techniques. Because it considers regulatory approval, this book will also suit industrial researchers from the early stages through pre-clinical research.
This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea of how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creating predictive or classification models - predicting nature or classifying individuals - and statistics is often used to prove or disprove phenomena rather than to aid in the design of a product or process. In industry, however, chemical engineers use designed experiments to optimize petroleum extraction, manufacturing engineers use experimental data to optimize machine operation, and industrial engineers might use data to determine the optimal number of operators required in a manual assembly process. This text teaches engineering and applied science students to incorporate empirical investigation into such design processes. Much of the discussion in this book is about models - not whether the models truly represent reality, but whether they adequately represent reality with respect to the problems at hand - and many ideas focus on how to gather data in the most efficient way possible to construct adequate models.
The book includes chapters on subjects not often seen together in a single text (e.g., measurement systems, mixture experiments, logistic regression, Taguchi methods, simulation). The techniques and concepts introduced cover a wide variety of design situations familiar to engineers and applied scientists and encourage the incorporation of experimentation and empirical investigation into the design process. Software is integrally linked to the statistical analyses, with fully worked examples in each chapter using several packages: SAS, R, JMP, Minitab, and MS Excel. Discussion questions appear at the end of each chapter. The fundamental learning objective of this textbook is for the reader to understand how experimental data can be used to make design decisions and to become familiar with the most common types of experimental designs and analysis methods.
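As a flavor of the design-of-experiments approach such a text teaches, the snippet below simulates a 2^3 full-factorial experiment and estimates the main effects by least squares. The factors, effect sizes, and noise level are invented for illustration (the book's own worked examples use SAS, R, JMP, Minitab, and Excel).

```python
import itertools
import numpy as np

# Hypothetical 2^3 full-factorial designed experiment on a process
# response; factors A, B, C are coded -1/+1.
levels = list(itertools.product([-1, 1], repeat=3))
X = np.array(levels, dtype=float)

rng = np.random.default_rng(42)
# Simulated responses: strong effect of factor A, weaker B, no C effect.
y = 50 + 6 * X[:, 0] + 2 * X[:, 1] + rng.normal(0, 0.5, len(X))

# Fit the main-effects model y = b0 + bA*A + bB*B + bC*C by least squares.
design = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
for name, b in zip(["intercept", "A", "B", "C"], coef):
    print(f"{name}: {b:+.2f}")
```

Because the coded design matrix is orthogonal, each estimated coefficient isolates one factor's effect, which is exactly the efficiency argument for designed experiments over one-factor-at-a-time testing.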
ITiB'2018 is the 6th Conference on Information Technology in Biomedicine, hosted every two years by the Department of Informatics & Medical Devices, Faculty of Biomedical Engineering, Silesian University of Technology. The Conference is organized under the auspices of the Committee on Biocybernetics and Biomedical Engineering of the Polish Academy of Sciences. The meeting has become an established event that helps to address the demand for fast and reliable technologies capable of processing data and delivering results in a user-friendly, timely and mobile manner. Many of these areas are recognized as research and development frontiers in employing new technology in the clinical setting. Technological assistance can be found in prevention, diagnosis, treatment, and rehabilitation alike. Homecare support for any type of disability may improve standard of living and make people's lives safer and more comfortable. The book includes the following sections: Image Processing; Multimodal Imaging and Computer-aided Surgery; Computer-aided Diagnosis; Signal Processing and Medical Devices; Bioinformatics; Modelling & Simulation; Analytics in Action on the SAS Platform; and Assistive Technologies and Affective Computing (ATAC).
Cross-over trials are an important class of design used in the pharmaceutical industry and medical research, and their use continues to grow. Cross-over Trials in Clinical Research, Second Edition has been fully updated to include the latest methodology used in the design and analysis of cross-over trials. It includes more background material, greater coverage of important statistical techniques, including Bayesian methods, and discussion of analysis using a number of statistical software packages.
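The defining feature of a cross-over design is that each subject receives every treatment, so treatment comparisons are made within subjects. Ignoring period and carry-over effects for simplicity, a minimal analysis of a 2x2 cross-over reduces to a paired comparison; the sketch below uses hypothetical data and a paired t-test (the book itself covers far richer models, including Bayesian methods).

```python
import numpy as np
from scipy import stats

# Hypothetical outcomes for six subjects, each measured under both
# treatments (period and carry-over effects ignored in this sketch).
treatment_a = np.array([5.1, 4.8, 6.0, 5.5, 4.9, 5.7])
treatment_b = np.array([4.2, 4.0, 5.1, 4.6, 4.3, 4.8])

# Within-subject differences remove between-subject variability,
# which is the main efficiency gain of a cross-over design.
t, p = stats.ttest_rel(treatment_a, treatment_b)
diff = np.mean(treatment_a - treatment_b)
print(f"mean difference = {diff:.2f}, t = {t:.2f}, p = {p:.4f}")
```

A proper cross-over analysis would also test for carry-over and period effects before pooling the sequences, which is where the dedicated methodology in the book becomes essential.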
This book provides an in-depth review of state-of-the-art orthopaedic techniques and basic mechanical operations (drilling, boring, cutting, grinding/milling) involved in present day orthopaedic surgery. Casting a light on exploratory hybrid operations, as well as non-conventional techniques such as laser assisted operations, this book further extends the discussion to include physical aspects of the surgery in view of material (bone) and process parameters. Featuring detailed discussion of the computational modeling of forces (mechanical and thermal) involved in surgical procedures for the planning and optimization of the process/procedure and system development, this book lays the foundations for efforts towards the future development of improved orthopaedic surgery. With topics including the role of bone machining during surgical operations; the physical properties of the bone which influence the response to any machining operation, and robotic automation, this book will be a valuable and comprehensive literature source for years to come.
This book will enable practitioners to understand the many complex intricacies of immunohistochemistry (IHC) and make best use of this powerful analytical tool. Providing a thorough grounding in the fundamentals of immunohistochemistry, the book includes several chapters on robotics and automation technology, giving key information on the design of machines and tips to maximise workflow efficiencies. The relationship between IHC and molecular pathology is explained clearly, demonstrating the increasing impact on personalized medicine and targeted therapies for cancer patients. The staining protocol is deconstructed, allowing the reader to adapt it for a variety of diagnostic and research applications. Written by experts at the forefront of hospital immunohistochemistry, there is a strong emphasis on practical guidance on a range of techniques as well as troubleshooting of common problems driven by the authors' experiences. Extensively illustrated with high-quality colour images, this is an invaluable resource to all pathology practitioners utilising the technique.
Cross-sectoral interaction and cooperation in the communication of nutritional health risks represents a strategic research area among national governments and international health authorities. The key research question this book addresses is whether and how different industrial sectors interact with each other in the communication and industrial utilisation of health research findings. Through the introduction and exploration of large-scale industry news and digital media resources, this book systematically analyses a range of digital news genres and identifies new and growing trends of inter-sectoral interaction around the communication of nutritional health in the Chinese language at both international and national levels. This book argues that cross-sectoral interaction can be explored to identify areas that require policy intervention to increase the efficiency and effectiveness of current health communication and promotion. Inter-sectoral interaction can also provide incentives to develop new social programmes and business models to innovate and transform traditional industrial sectors.
In this book, leading authors in the field discuss developments in Ambient Assisted Living (AAL). The contributions were chosen from and invited at the 8th AAL Congress, Frankfurt am Main. The meeting presents new technological developments that support the autonomy and independence of individuals with special needs. The 8th AAL Congress focuses its attention on technical assistance systems and their applications in homecare, health, and care.
This work represents an inventive attempt to apply recent advances in nanotechnology to identify and characterise novel polymer systems for drug delivery through the skin. Atomic force microscopy (AFM) measurements of the nanoscale mechanical properties of topical, drug-containing polymeric films enabled the author to identify optimal compositions, in terms of flexibility and substantivity, for application to the skin. To elucidate the enhanced drug release from polyacrylate films incorporating medium chain triglycerides, the author combined AFM studies with the complementary technique of Raman micro-spectroscopy. This experimental strategy revealed that the significant increase in the drug released from these films is the result of a nanoscale two-phase structure. Finally, in experiments examining the microporation of skin using femtosecond laser ablation, the author demonstrated that the threshold at which the skin's barrier function is undermined can be dramatically reduced by the pre-application of ink. The approach allows thermal damage at the pore edge to be minimised, suggesting a very real potential for substantially increasing drug delivery in a minimally invasive fashion.
This book equips students with a thorough understanding of various types of sensors and biosensors that can be used for chemical, biological, and biomedical applications, including but not limited to temperature sensors, strain sensors, light sensors, spectrophotometric sensors, pulse oximeters, optical fiber probes, fluorescence sensors, pH sensors, ion-selective electrodes, piezoelectric sensors, glucose sensors, DNA and immunosensors, lab-on-a-chip biosensors, paper-based lab-on-a-chip biosensors, and microcontroller-based sensors. The author treats the study of biosensors with an applications-based approach, including over 15 extensive, hands-on labs given at the end of the chapters. The material is presented using a building-block approach, beginning with the fundamentals of sensor design and temperature sensors and ending with more complicated biosensors. New to this second edition are sections on op-amp filters, pulse oximetry, meat quality monitoring, advanced fluorescent dyes, autofluorescence, various fluorescence detection methods, fluoride ion-selective electrodes, advanced glucose sensing methods including continuous glucose monitoring, and paper-based lab-on-a-chip devices. A new chapter on nano-biosensors and an appendix on microcontrollers make this textbook ideal for undergraduate engineering students studying biosensors. It can also serve as a hands-on guide for scientists and engineers working in the sensor or biosensor industries.
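For example, a spectrophotometric sensor infers analyte concentration from light absorbance via the Beer-Lambert law, A = ε·l·c. The sketch below shows the basic calculation; the function name is hypothetical, and the molar absorptivity used is the commonly cited value for NADH at 340 nm.

```python
def concentration_from_absorbance(absorbance, molar_absorptivity, path_length_cm=1.0):
    """Invert the Beer-Lambert law A = epsilon * l * c to get c = A / (epsilon * l)."""
    return absorbance / (molar_absorptivity * path_length_cm)

# NADH at 340 nm has a molar absorptivity of about 6220 L/(mol*cm);
# an absorbance of 0.311 in a 1 cm cuvette then implies:
c = concentration_from_absorbance(0.311, 6220)
print(f"concentration = {c:.2e} mol/L")  # concentration = 5.00e-05 mol/L
```

The same inversion underlies many enzymatic assays, where the sensor tracks a colored or fluorescent reaction product rather than the analyte itself.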
This two-volume set, LNBI 10813 and LNBI 10814, constitutes the proceedings of the 6th International Work-Conference on Bioinformatics and Biomedical Engineering, IWBBIO 2018, held in Granada, Spain, in April 2018. The 88 regular papers presented were carefully reviewed and selected from 273 submissions. The scope of the conference spans the following areas: bioinformatics for healthcare and diseases; bioinformatics tools to integrate omics datasets and address biological questions; challenges and advances in measurement and self-parametrization of complex biological systems; computational genomics; computational proteomics; computational systems for modelling biological processes; drug delivery system design aided by mathematical modelling and experiments; generation, management, and biological insights from big data; high-throughput bioinformatic tools for medical genomics; next-generation sequencing and sequence analysis; interpretable models in biomedicine and bioinformatics; little-big data: reducing the complexity and facing the uncertainty of highly underdetermined phenotype prediction problems; biomedical engineering; biomedical image analysis; biomedical signal analysis; challenges in smart and wearable sensor design for mobile health; and healthcare and diseases.
The application of bioinformatics approaches in drug design involves an interdisciplinary array of sophisticated techniques and software tools to elucidate hidden or complex biological data. This work reviews the latest bioinformatics approaches used for drug discovery. The text covers ligand-based and structure-based approaches for computer-aided drug design, 3D pharmacophore modeling, molecular dynamics simulation, the thermodynamics of ligand receptor and ligand enzyme association, thermodynamic characterization and optimization, and techniques for computational genomics and proteomics.
This book provides a complete overview of significant design challenges with respect to circuit miniaturization and power reduction in neural recording systems, along with circuit topologies, architecture trends, and (post-silicon) circuit optimization algorithms. The novel circuits introduced for signal conditioning, quantization, and classification, as well as the system configurations, focus on optimized power-per-area performance, from the spatial resolution (i.e., the number of channels), feasible wireless data bandwidth, and information quality to the delivered power of the implantable system.
This textbook on practical data analytics unites fundamental principles, algorithms, and data. Algorithms are the keystone of data analytics and the focal point of this textbook. Clear and intuitive explanations of the mathematical and statistical foundations make the algorithms transparent. But practical data analytics requires more than just the foundations: problems and data are enormously variable, and only the most elementary algorithms can be used without modification. Programming fluency and experience with real and challenging data are indispensable, so the reader is immersed in Python and R and in real data analysis. By the end of the book, the reader will have gained the ability to adapt algorithms to new problems and carry out innovative analyses. This book has three parts:
(a) Data Reduction: begins with the concepts of data reduction, data maps, and information extraction. The second chapter introduces associative statistics, the mathematical foundation of scalable algorithms and distributed computing. Practical aspects of distributed computing are the subject of the Hadoop and MapReduce chapter.
(b) Extracting Information from Data: linear regression and data visualization are the principal topics of Part II. The authors dedicate a chapter to the critical domain of healthcare analytics for an extended example of practical data analytics. The algorithms and analytics will be of much interest to practitioners interested in utilizing the large and unwieldy data sets of the Centers for Disease Control and Prevention's Behavioral Risk Factor Surveillance System.
(c) Predictive Analytics: two foundational and widely used algorithms, k-nearest neighbors and naive Bayes, are developed in detail. A chapter is dedicated to forecasting. The last chapter focuses on streaming data and uses publicly accessible data streams originating from the Twitter API and the NASDAQ stock market in the tutorials.
This book is intended for a one- or two-semester course in data analytics for upper-division undergraduate and graduate students in mathematics, statistics, and computer science. The prerequisites are kept low, and students with one or two courses in probability or statistics, an exposure to vectors and matrices, and a programming course will have no difficulty. The core material of every chapter is accessible to all with these prerequisites. The chapters often expand at the close with innovations of interest to practitioners of data science. Each chapter includes exercises of varying levels of difficulty. The text is eminently suitable for self-study and an exceptional resource for practitioners.
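As a flavor of the predictive analytics material, the sketch below implements a minimal k-nearest neighbors classifier (one of the two foundational algorithms named above). The toy data and function name are illustrative assumptions, not taken from the book.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points
    under Euclidean distance - a minimal sketch of the k-NN algorithm."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Two well-separated toy clusters with string labels.
X = np.array([[1.0, 1.0], [1.2, 0.8], [0.9, 1.1],
              [5.0, 5.0], [5.2, 4.8], [4.9, 5.1]])
y = np.array(["low", "low", "low", "high", "high", "high"])

print(knn_predict(X, y, np.array([1.1, 0.9])))  # low
print(knn_predict(X, y, np.array([5.1, 5.0])))  # high
```

In practice the interesting questions are the ones the book develops: choosing k, scaling features so no dimension dominates the distance, and making the search scale to large training sets.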
This comprehensive book focuses on better big-data security for healthcare organizations. Following an extensive introduction to the Internet of Things (IoT) in healthcare including challenging topics and scenarios, it offers an in-depth analysis of medical body area networks with the 5th generation of IoT communication technology along with its nanotechnology. It also describes a novel strategic framework and computationally intelligent model to measure possible security vulnerabilities in the context of e-health. Moreover, the book addresses healthcare systems that handle large volumes of data driven by patients' records and health/personal information, including big-data-based knowledge management systems to support clinical decisions. Several of the issues faced in storing/processing big data are presented along with the available tools, technologies and algorithms to deal with those problems as well as a case study in healthcare analytics. Addressing trust, privacy, and security issues as well as the IoT and big-data challenges, the book highlights the advances in the field to guide engineers developing different IoT devices and evaluating the performance of different IoT techniques. Additionally, it explores the impact of such technologies on public, private, community, and hybrid scenarios in healthcare. This book offers professionals, scientists and engineers the latest technologies, techniques, and strategies for IoT and big data.
Emerging Practices in Telehealth: Best Practices in a Rapidly Changing Field is an introduction to telehealth basics, best practices and implementation methods. The book guides the reader from start to finish through the workflow implementation of telehealth technology, including EMRs, clinical workflows, RPM, billing systems, and patient experience. It also explores how telehealth can increase healthcare access and decrease disparities across the globe. Practicing clinicians, medical fellows, allied healthcare professionals, hospital administrators, and hospital IT professionals will all benefit from this practical guidebook.
Rapid advancement of telecommunications and information technology has created the potential for high-quality expert healthcare to be delivered when and where it is needed. This text charts the development of the telemedicine industry, defines its current scope and reveals the potential of new methodologies.
Covering physical as well as mathematical and algorithmic foundations, this graduate textbook provides the reader with an introduction to modern biomedical imaging and to image processing and reconstruction. These techniques are based not only on advanced instrumentation for image acquisition, but equally on new developments in image processing and reconstruction to extract relevant information from recorded data. To this end, the book offers a quantitative treatment of radiography, computed tomography, and medical physics. Contents: Introduction; Digital image processing; Essentials of medical x-ray physics; Tomography; Radiobiology, radiotherapy, and radiation protection; Phase contrast radiography; Object reconstruction under nonideal conditions.
Monika Futschik introduces an evaluation model that allows a holistic assessment of the advantages and disadvantages of electronic batch recording solutions versus traditional paper batch ticket solutions. In contrast to earlier studies, this newly developed evaluation model considers the change management effort and the financial investment required for system deployment. The model demonstrates the overall performance value of implementing electronic batch recording solutions and supports decision-makers in finding the most effective solution. The development and validation of the model are based on various surveys, expert interviews, a Delphi study, and a case study with a real-life pharmaceutical company. The outcome of her research can easily be applied to other industries as well.