This book constitutes the refereed proceedings of the Third International KR4HC 2011 workshop, held in conjunction with the 13th Conference on Artificial Intelligence in Medicine, AIME 2011, in Bled, Slovenia, in July 2011. The 11 extended papers presented together with 1 invited paper were carefully reviewed and selected from 22 submissions. The papers cover topics such as health care knowledge sharing; health processes; clinical practice guidelines; and patient records, ontologies, medical costs, and clinical trials.
Increasingly, more computer applications are becoming available to assist mental health clinicians and administrators in patient evaluation and treatment, and in mental health management, education, and research. Topics covered include: automated assessment procedures; MR-E (The Mental Retardation Expert); computerized assessment system for psychotherapy evaluation and research; computer assisted therapy of stress related conditions; computerized patient evaluation in a clinical setting; computerized treatment planning; the VA national mental health database; networks; managed care; DSM-IV diagnosis; quality management; cost control; knowledge coupling; telemedicine; the clinical library assistant; and monitoring independent service providers.
A comprehensive introduction to ICA for students and practitioners. Independent Component Analysis (ICA) is one of the most exciting new topics in fields such as neural networks, advanced statistics, and signal processing. This is the first book to provide a comprehensive introduction to the technique, complete with the fundamental mathematical background needed to understand and utilize it. It offers a general overview of the basics of ICA, important solutions and algorithms, and in-depth coverage of new applications in image processing, telecommunications, audio signal processing, and more. Independent Component Analysis is divided into four sections.
Authors Hyvärinen, Karhunen, and Oja are well known for their contributions to the development of ICA and here cover all the relevant theory, new algorithms, and applications in various fields. Researchers, students, and practitioners from a variety of disciplines will find this accessible volume both helpful and informative.
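As a rough illustration of what ICA does (this sketch is not taken from the book, and the mixing matrix and signals below are invented for the demo), two independent non-Gaussian signals can be mixed and then recovered, up to sign and permutation, with a symmetric FastICA iteration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Two independent non-Gaussian sources: a uniform and a Laplacian signal.
s = np.vstack([rng.uniform(-1, 1, n), rng.laplace(0, 1, n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # hypothetical mixing matrix
x = A @ s                                # observed mixtures

# Whitening: center, then decorrelate and normalise the mixtures.
x -= x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x

# Symmetric FastICA with the tanh nonlinearity.
W = rng.normal(size=(2, 2))
for _ in range(200):
    g = np.tanh(W @ z)
    g_prime = 1 - g ** 2
    W_new = (g @ z.T) / n - np.diag(g_prime.mean(axis=1)) @ W
    # Symmetric decorrelation keeps the rows of W orthonormal.
    u, _, vt = np.linalg.svd(W_new)
    W = u @ vt

y = W @ z  # estimated sources (up to sign and permutation)
```

Each row of `y` should correlate almost perfectly with one of the original sources, which is the best any ICA method can do given its inherent sign and ordering ambiguity.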
As its name implies, this book deals with clinical information systems. The clinical information system (or CIS) is an automated system with a long-term database containing clinical information used for patient care. This definition excludes business systems (no clinical data), physiological monitoring systems (no long-term database), and many research systems (not used in patient care). The theses of this book are (a) that CIS technology is mature, (b) that the CIS will have a major impact upon patient care and the health delivery system, and (c) that the number of commercial systems which now offer these potential benefits is very small. The objective of this book is to establish the above theses and thereby (a) inform both users and developers, (b) increase the demand for more sophisticated products, and finally, (c) provide marketplace incentives to advance the state of the art. The CIS is an application of computer technology for a specific class of problems. Its development requires a knowledge of the technology with an understanding of the application area. As with any tool-based application, the scope of the product will be limited by the capability of the tool. In the case of the CIS, reliable computers with comprehensive database facilities became commercially available in the early 1970s. By the mid-1970s there was a maturation of the literature, and evaluations of 5-years' use began to appear. As will be shown, there have been surprisingly few new ideas introduced since the 1970s.
The book collects the contributions to the NATO Advanced Study Institute on "Speech Recognition and Understanding: Recent Advances, Trends and Applications", held in Cetraro, Italy, during the first two weeks of July 1990. This Institute focused on three topics considered of particular interest and rich in innovation by researchers in the fields of speech recognition and understanding: advances in hidden Markov modeling, connectionist approaches to speech and language modeling, and linguistic processing including language and dialogue modeling. The purpose of any ASI is to encourage scientific communication between researchers of NATO countries through advanced tutorials and presentations: excellent tutorials were offered by invited speakers, who present in this book 15 papers that summarize or detail the topics covered in their lectures. The lectures were complemented by discussions, panel sessions, and presentations of related work carried out by some of the attending researchers; these presentations have been collected in 42 short contributions to the Proceedings. This volume, which the reader may find useful for an overview, although incomplete, of the state of the art in speech understanding, is divided into 6 Parts.
The NATO-sponsored Advanced Study Institute 'The Biology and Technology of Intelligent Autonomous Agents' was an extraordinary event. For two weeks it brought together the leading proponents of the new behavior-oriented approach to Artificial Intelligence in Castel Ivano near Trento. The goal of the meeting was to establish a solid scientific and technological foundation for the field of intelligent autonomous agents, with a bias towards the new methodologies and techniques that have recently been developed in Artificial Intelligence under the strong influence of biology. Major themes of the conference were: bottom-up AI research, artificial life, neural networks and techniques of emergent functionality. The meeting was such an extraordinary event because it not only featured very high quality lectures on autonomous agents and the various fields feeding into it, but also robot laboratories which were set up by the MIT AI laboratory (with a lab led by Rodney Brooks) and the VUB AI laboratory (with labs led by Tim Smithers and Luc Steels). This way the participants could also gain practical experience and discuss in concreto what the difficulties and achievements of the different approaches were. In fact, the meeting was such a success that a follow-up meeting is planned for September 1995 in Monte Verita (Switzerland). This meeting is organised by Rolf Pfeifer (University of Zurich).
Designed to assist the physician in the application of computers in private medical practice, this comprehensive guide outlines where, why, and how this valuable tool can best be used. Integrating the mechanisms of computerization with the implications for health care, the authors draw on personal research and experience to describe models used effectively in the medical setting. Chapters cover administrative procedures, applications for marketing and quality assurance, and the link to an office-hospital application. Also included is information on software, hardware, database management, expert systems, artificial intelligence, and indications of future trends. This work will serve as an essential reference in meeting the ever-increasing medical information needs of the private practitioner.
Computer technology has impacted the practice of medicine in dramatic ways. Imaging techniques provide noninvasive tools which alter the diagnostic process. Sophisticated monitoring equipment presents new levels of detail for both patient management and research. In most of these technology applications, the computer is embedded in the device; its presence is transparent to the user. There is also a growing number of applications in which the health care provider directly interacts with a computer. In many cases, these applications are limited to administrative functions, e.g., office practice management, location of hospital patients, appointments, and scheduling. Nevertheless, there also are instances of patient care functions such as results reporting, decision support, surveillance, and reminders. This series, Computers and Medicine, will focus upon the direct use of information systems as it relates to the medical community. After twenty-five years of experimentation and experience, there are many tested applications which can be implemented economically using the current generation of computers. Moreover, the falling cost of computers suggests that there will be even more extensive use in the near future. Yet there is a gap between current practice and the state-of-the-art.
Incentives for innovation are particularly relevant in the pharmaceutical industry, where not all social needs provide equally profitable opportunities and where most OECD countries try to implement measures that promote research in these less profitable areas. This book describes how incentives can be provided for less profitable activities when no clear markets exist for the innovations. The book discusses alternative mechanisms to substitute for nonexistent markets, situations where traditional instruments have proven totally insufficient, and the clear mismatch between the size of the markets being targeted and the incentives being provided. Patents become an ineffective way to incentivise R&D when appropriability is low; this book provides alternative ideas, such as allowing a period of data exclusivity to firms that develop new drugs.
The contents of this volume derive loosely from an EMBO workshop held at EMBL (Heidelberg) towards the end of 1989. The topic of Patterns in Protein Sequence and Structure attracted a wide range of participants, from biochemists to computer scientists, and that diversity has, to some extent, remained in the contributions to this volume. The problems of interpreting biological sequence data are to an increasing extent forcing molecular biologists to learn the language of computers, including at times even the abstruse language of the computer scientists themselves. The computer scientists, for their part, have discovered a veritable honey-pot of real data on which to test their algorithms. This enforced meeting of two otherwise alien fields has resulted in some difficulties in communication, and it was an aim of the EMBO workshop to help resolve these. By the end, most biologists at the meeting had at least heard the terms Dynamic Programming and Regular Expression, while for their part the computer programmers began to realise that protein sequences might be more than simple Markov chains in a 20-letter alphabet. Thanks to the modern facilities at EMBL, the three-day meeting was video-taped, and from this a transcript was taken and offered to the speakers as the basis for a contribution to this volume.
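The dynamic programming the biologists encountered at that workshop is the workhorse of sequence comparison. As a minimal sketch (not drawn from the volume; the scoring scheme is an arbitrary illustration, not a biological substitution matrix), a Needleman-Wunsch global alignment score can be computed as:

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    """Global alignment score of sequences a and b by dynamic programming.

    F[i][j] holds the best score for aligning a[:i] against b[:j];
    each cell is the max of a (mis)match, a gap in b, or a gap in a.
    """
    m, n = len(a), len(b)
    F = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        F[i][0] = i * gap          # a[:i] aligned entirely against gaps
    for j in range(1, n + 1):
        F[0][j] = j * gap          # b[:j] aligned entirely against gaps
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            F[i][j] = max(F[i - 1][j - 1] + s,   # align a[i-1] with b[j-1]
                          F[i - 1][j] + gap,     # gap in b
                          F[i][j - 1] + gap)     # gap in a
    return F[m][n]
```

For identical sequences the score is simply the match reward times the length, e.g. `needleman_wunsch("ACGT", "ACGT")` gives 4 under this scoring.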
The amount of molecular information is too vast to be acquired without the use of computer-based systems. The authors introduce students entering research in molecular biology and related fields to the efficient use of the numerous databases available. They show the broad scientific context of these databases and their latest developments. They also put the biological, chemical and computational aspects of structural information on biomolecules into perspective. The book is required reading for researchers and students who plan to use a modern computer environment in their research.
This monograph series is intended to provide medical information scientists, health care administrators, health care providers, and computer science professionals with successful examples and experiences of computer applications in health care settings. Through the exposition of these computer applications, we attempt to show what is effective and efficient and hopefully provide some guidance on the acquisition or design of information systems so that costly mistakes can be avoided. The health care industry is currently being pushed and pulled from all directions - from the clinical side to increase quality of care, from the business side to improve financial stability, from the legal and regulatory sides to provide more detailed documentation, and, in a university environment, to provide more data for research and improved opportunities for education. Medical information systems sit in the middle of all these demands. They are not only asked to provide more, better, and more timely information but also to interact with and monitor the process of health care itself by providing clinical reminders, warnings about adverse drug interactions, alerts to questionable treatment, alarms for security breaches, mail messages, workload schedules, etc. Clearly, medical information systems are functionally very rich and demand quick response time and a high level of security. They can be classified as very complex systems and, from a developer's perspective, as 'risky' systems.
The NATO workshop on Disordered Systems and Biological Organization was attended, in March 1985, by 65 scientists representing a large variety of fields: Mathematics, Computer Science, Physics and Biology. It was the purpose of this interdisciplinary workshop to shed light on the conceptual connections existing between fields of research apparently as different as automata theory, combinatorial optimization, spin glasses and the modeling of biological systems, all of them concerned with the global organization of complex systems, locally interconnected. Common to many contributions to this volume is the underlying analogy between biological systems and spin glasses: they share the same properties of stability and diversity. This is the case for instance of primary sequences of biopolymers like proteins and nucleic acids considered as the result of mutation-selection processes [P. W. Anderson, 1983] or of evolving biological species [G. Weisbuch, 1984]. Some of the most striking aspects of our cognitive apparatus, involved in learning and recognition [J. Hopfield, 1982], can also be described in terms of stability and diversity in a suitable configuration space. These interpretations and preoccupations merge with those of theoretical biologists like S. Kauffman [1969] (genetic networks) and of mathematicians of automata theory: the dynamics of networks of automata can be interpreted in terms of the organization of a system in multiple possible attractors. The present introduction outlines the relationships between the contributions presented at the workshop and briefly discusses each paper in its particular scientific context.
Medical imaging is a very important area in diagnostic (and increasingly therapeutic) medicine. Many new techniques are being developed or extended which depend on digital methods. Although conventional x-radiographs still comprise the bulk of the medical images acquired in a hospital, digital methods such as computerized tomography and magnetic resonance imaging are now often claimed to have a more significant clinical impact. This book is concerned with three aspects of such digital images: their formation, or how they can be acquired; their handling, or how they may be manipulated to increase their clinical value; and their evaluation, or how their impact and value may be assessed. The book is divided into three parts. Part 1 comprises a series of reviews in the general subject area written by authorities in the field. Part 2 includes papers on theoretical aspects: 3D images, reconstruction, perception, and image processing. Part 3 includes papers on applications in nuclear medicine, magnetic resonance, and radiology.
Bioinformatics can be loosely defined as the collection, classification, storage, and analysis of biochemical and biological information using computers and mathematical algorithms. Although no single person or group started the field wholly on their own, Temple Smith, Ph.D., a professor at Boston University, is generally credited with coining the term. Bioinformatics represents a marriage of biology, medicine, computer science, physics, and mathematics, fields of study that have historically existed as mutually exclusive disciplines. Concurrently, bioinformatics has vaulted into the public's eye in lay newspapers and magazines, most notably in the area of (personalized) DNA sequencing. The combined result is that bioinformatics is being heralded as a panacea for the current limitations in the clinical management of cancer. While certainly overoptimistic in some regards, this designation is not without promise, particularly in the area of cancer diagnosis and prognosis. The focus of this book is: i) to provide a historical and technical perspective on the analytical techniques, methodologies, and platforms used in bioinformatics experiments, ii) to show how a bioinformatics approach has been used to characterize various cancer-related processes, and iii) to demonstrate how a bioinformatics approach is being used to bridge basic science and the clinical arena to positively impact patient care and management.
User models have recently attracted much research interest in the field of artificial intelligence dialog systems. It has become evident that a flexible user-oriented dialog behavior of such systems can be realized only if the system has at its disposal a model of the user, containing assumptions about the user's background knowledge as well as the user's goals and plans in consulting the system. Research in the field of user models investigates how such assumptions can be automatically created, represented and exploited by the system in the course of an interaction with the user. This volume is the first survey of the field of user modeling. Most of the prominent international researchers in this area have contributed to this volume. Their papers are grouped into four sections: the introductory section contains a general view of the field as a whole, and a number of surveys of specific problems and techniques in user modeling. Sections two and three describe eight user modeling systems, with the focus lying on the automatic creation and exploitation of assumptions about the user, respectively. The final section discusses several limits of current systems, and proposes solutions as to how some of the shortcomings might be overcome. In order to increase the quality and the coherency of the volume, each paper has been reviewed by all other contributors. Cross-references have been integrated wherever appropriate. All contributions are introduced in editorial prefaces pertaining to each section. A subject index and an extensive bibliography supplement the book.
The visualization of human anatomy for diagnostic, therapeutic, and educational purposes has long been a challenge for scientists and artists. In vivo medical imaging could not be introduced until the discovery of X-rays by Wilhelm Conrad Röntgen in 1895. With the early medical imaging techniques which are still in use today, the three-dimensional reality of the human body can only be visualized in two-dimensional projections or cross-sections. Recently, biomedical engineering and computer science have begun to offer the potential of producing natural three-dimensional views of the human anatomy of living subjects. For a broad application of such technology, many scientific and engineering problems still have to be solved. In order to stimulate progress, the NATO Advanced Research Workshop was organized in Travemünde, West Germany, from June 25 to 29. It brought together approximately 50 experts in 3D medical imaging from all over the world. Among the topics, image acquisition was addressed first, since its quality decisively influences the quality of the 3D images. For 3D image generation - in distinction to 2D imaging - a decision has to be made as to which objects contained in the data set are to be visualized. Therefore special emphasis was laid on methods of object definition. For the final visualization of the segmented objects, a large variety of visualization algorithms have been proposed in the past. The meeting assessed these techniques.
Computer technology has impacted the practice of medicine in dramatic ways. Imaging techniques provide noninvasive tools which alter the diagnostic process. Sophisticated monitoring equipment presents new levels of detail for both patient management and research. In most of these high technology applications, the computer is embedded in the device; its presence is transparent to the user. There is also a growing number of applications in which the health care provider directly interacts with a computer. In many cases, these applications are limited to administrative functions, e.g., office practice management, location of hospital patients, appointments, and scheduling. Nevertheless, there also are instances of patient care functions such as results reporting, decision support, surveillance, and reminders. This series, Computers and Medicine, focuses upon the direct use of information systems as it relates to the medical community. After twenty-five years of experimentation and experience, there are many tested applications which can be implemented economically using the current generation of computers. Moreover, the falling cost of computers suggests that there will be even more extensive use in the near future. Yet there is a gap between current practice and the state-of-the-art.
Dentistry today is changing because of new knowledge networks based on electronic technology. This book tells practitioners, administrators and educators what is happening in dentistry and how to use the full potential of new information technologies. Specifics such as existing machines, operating systems, software packages and user support groups are discussed. Aspects of standards for storage, access, and the use of information as well as its integration into the dental practice are covered. More general topics cover the impact of dental informatics on dentistry training programs, the dental manufacturing industries and insurance.
This volume is based on lectures held at the NATO Advanced Study Institute on Multiple Criteria Decision Making and Risk Analysis Using Microcomputers that took place in Istanbul, Turkey from June 28 to July 8, 1987. The book considers aspects of multiple criteria decision making and risk analysis, with numerous methods and applications using microcomputers. The methodology included is fairly representative of the field. It covers the Analytical Hierarchy Process of Saaty, the approaches of Zionts and Wallenius and their colleagues, and the work of Dyer, Steuer, and Yu. Important behavioral considerations in decision making, psychological aspects of judgement and choice, and preference elicitation are discussed, followed by a number of applications of methods in various fields including hospital diagnostics systems and production planning. Special emphasis is placed on the growing importance of computer graphics in multiple criteria models. The approach of Korhonen and Wallenius in "A Pareto Race" is presented as a decision support system. The "Trimap" approach by Climaco, an approach that has some potential for end users, is of particular interest to MCDM researchers.
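Saaty's Analytic Hierarchy Process, mentioned above, derives priority weights for criteria from a matrix of pairwise comparisons via its principal eigenvector, with a consistency check on the judgements. A minimal sketch, with comparison values invented purely for illustration (on Saaty's 1-9 scale, criterion 0 is moderately preferred to 1 and strongly preferred to 2):

```python
import numpy as np

# Hypothetical reciprocal pairwise comparison matrix for three criteria.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Principal eigenvector by power iteration gives the priority weights.
w = np.ones(3) / 3
for _ in range(100):
    w = A @ w
    w /= w.sum()          # renormalise so the weights sum to 1

# Consistency check: lambda_max near n means near-consistent judgements.
lam = (A @ w / w).mean()          # estimate of the principal eigenvalue
CI = (lam - 3) / (3 - 1)          # consistency index for n = 3
CR = CI / 0.58                    # Saaty's random index RI = 0.58 for n = 3
```

A consistency ratio CR below about 0.1 is conventionally taken as acceptable; the weights `w` then rank the criteria for the decision at hand.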
Computer technology has impacted the practice of medicine in dramatic ways. Imaging techniques provide noninvasive tools which alter the diagnostic process. Sophisticated monitoring equipment presents new levels of detail for both patient management and research. In most of these technology applications, the computer is embedded in the device; its presence is transparent to the user. There is also a growing number of applications in which the health care provider directly interacts with a computer. In many cases, these applications are limited to administrative functions, e.g., office practice management, location of hospital patients, appointments, and scheduling. Nevertheless, there also are instances of patient care functions such as results reporting, decision support, surveillance, and reminders. This series, Computers and Medicine, will focus upon the direct use of information systems as it relates to the medical community. After twenty-five years of experimentation and experience, there are many tested applications which can be implemented economically using the current generation of computers. Moreover, the falling cost of computers suggests that there will be even more extensive use in the near future. Yet there is a gap between current practice and the state-of-the-art.
Nanotechnologies are among the fastest growing areas of scientific research, and this is expected to have a substantial impact on human health care, especially in biomedical applications and nanomedicine now and in the near future. In the present scenario, nanotechnology is spreading its wings to address the key problems in the field of nanomedicine and human health care by improving diagnosis, prevention, treatment, and tissue engineering. This book provides an in-depth investigation of nanotechnology-based therapy and recent advancements in this field for revolutionizing the treatments for various fatal diseases, including cardiovascular and infectious diseases.
Computer technology has impacted the practice of medicine in dramatic ways. Imaging techniques provide noninvasive tools which alter the diagnostic process. Sophisticated monitoring equipment presents new levels of detail for both patient management and research. In most of these high technology applications, the computer is embedded in the device; its presence is transparent to the user. There is also a growing number of applications in which the health care provider directly interacts with a computer. In many cases, these applications are limited to administrative functions, e.g., office practice management, location of hospital patients, appointments, and scheduling. Nevertheless, there also are instances of patient care functions such as results reporting, decision support, surveillance, and reminders. This series, Computers and Medicine, will focus upon the direct use of information systems as it relates to the medical community. After twenty-five years of experimentation and experience, there are many tested applications which can be implemented economically using the current generation of computers. Moreover, the falling cost of computers suggests that there will be even more extensive use in the near future. Yet there is a gap between current practice and the state-of-the-art.
This book is the result of several years of enthusiastic planning and effort. Much of this enthusiasm came from the experience of developing Critical Care Consultant, a large BASIC program for critical care applications (St. Louis, C. V. Mosby, 1985). Working with clinicians showed me that many were interested in learning about clinical applications of computers (and even programming in small doses) but were faced with a paucity of clinical application software. Few had the time or training to develop any such software on their own. After a search through the existing medical literature unearthed relatively little in the way of usable programs, I decided that a series of small clinical applications programs would be of use to the medical community. At the outset a number of strategic decisions were made: (1) the programs would be written in BASIC, in view of its universal popularity, (2) the units used for clinical laboratory tests would be those in common use in the United States, (3) the programs would be simple and easily understood and employ no exotic tricks that were not easily transported across computers, (4) references to the literature would be provided to allow clinicians to critically assess the algorithm or method used themselves or to follow up on subsequent criticisms that may have been published, and (5) the programs would demonstrate reasonable standards of software engineering in terms of clarity, transportability, documentation, and ease of modification.
This book contains a collection of quantitative procedures in common use in pharmacology and related disciplines. It is intended for students and researchers in all fields who work with drugs. Many physicians, especially those concerned with clinical pharmacology, will also find much that is useful. The procedures included may be considered "core" since they are generally applicable to all classes of drugs. Some of the procedures deal with statistics and, hence, have even wider application. In this new edition we have increased the number of procedures from 33 (in the first edition) to 48. Other procedures have been revised and expanded. Yet the basic philosophy of this new edition remains unchanged from the first. That is, the pharmacologic basis of each procedure is presented, along with the necessary formulas and one or more worked examples. An associated computer program is included for each procedure and its use is illustrated with the same worked example used in the text. The discussions of theory and the sample computations are brief and self-contained, so that all computations can be made with the aid of a pocket calculator and the statistical tables contained in Appendix A. Yet it is realized that the proliferation of lower-priced microcomputers is likely to mean that more and more readers will utilize a computer for most calculations. Accordingly, we have modified the format of the book to facilitate computer usage.