Autonomous sensors transmit data and power their electronics without using cables. They can be found, for example, in wireless sensor networks (WSNs) or remote acquisition systems. Although primary batteries provide a simple design for powering autonomous sensors, they present several limitations, such as limited capacity and power density and difficulty in predicting their condition and state of charge. An alternative is to extract energy from the ambient environment (energy harvesting). However, the reduced dimensions of most autonomous sensors lead to a low level of power available from the energy transducer, so efficient methods and circuits to manage and gather the energy are a must. This book presents an integral approach to powering autonomous sensors that considers both primary batteries and energy harvesters. Two rather different forms of energy harvesting are also dealt with: optical (solar) and radiofrequency (RF). Optical energy provides high energy density, especially outdoors, whereas RF remote powering is possibly the most feasible option for autonomous sensors embedded in the soil or within structures. Across the chapters, devices such as primary and secondary batteries, supercapacitors, and energy transducers are extensively reviewed; circuits and methods from the literature used to efficiently extract and gather the energy are then presented; and finally, new proposals based on the authors' own research are analyzed and tested. Each chapter is written to be largely self-contained and incorporates the relevant literature references. Powering Autonomous Sensors is intended for a wide audience working on or interested in the powering of autonomous sensors. Researchers and engineers will find a broad introduction to basic topics in this interesting and emerging area, as well as further insights into solar and RF harvesting and into circuits and methods for maximizing the power extracted from energy transducers.
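The blurb's closing point, about circuits and methods that maximize the power extracted from an energy transducer, can be illustrated with a perturb-and-observe maximum power point tracking loop, a standard technique for photovoltaic harvesters. The sketch below is a generic illustration under assumed parameters; the single-diode cell model, step size and function names are assumptions, not material from the book.

    # Minimal perturb-and-observe MPPT sketch for a small photovoltaic harvester.
    # Illustrative only: the cell model and parameters are assumptions, not the book's circuits.
    import math

    def cell_current(v, i_ph=0.05, i_0=1e-9, v_t=0.025):
        """Single-diode solar-cell model: photocurrent minus diode current (amperes)."""
        return i_ph - i_0 * (math.exp(v / v_t) - 1.0)

    def perturb_and_observe(v=0.1, dv=0.005, steps=200):
        """Climb the P-V curve: keep perturbing the voltage in whichever direction raises power."""
        p_prev = v * cell_current(v)
        for _ in range(steps):
            v += dv
            p = v * cell_current(v)
            if p < p_prev:      # power dropped, so reverse the perturbation direction
                dv = -dv
            p_prev = p
        return v, p_prev

    if __name__ == "__main__":
        v_mpp, p_mpp = perturb_and_observe()
        print(f"operating point ~{v_mpp:.3f} V, ~{p_mpp * 1e3:.2f} mW")

After enough iterations the operating voltage oscillates around the maximum power point, which is the behaviour such tracking loops are designed to achieve.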
Mitochondria are essential organelles in eukaryotic cells that control such diverse processes as energy metabolism, calcium buffering, and cell death. Recent studies have revealed that the remodeling of mitochondrial morphology by fission and fusion, a process known as mitochondrial dynamics, is particularly important for neuronal function and survival. Defects in this process are commonly found in neurodegenerative diseases, offering a new paradigm for investigating mechanisms of neurodegeneration. To provide researchers working on neurodegenerative diseases and mitochondria with updated information on this rapidly progressing field, we have invited experts in the field to critically review recent progress and identify future research directions. The topics include the genetics of mitochondrial dynamics; its interplay with bioenergetics, autophagy, apoptosis, and axonal transport; and its role in neurological diseases, including Alzheimer's, Parkinson's, and Huntington's diseases.
Grids, P2P and Services Computing, the 12th volume of the CoreGRID series, is based on the CoreGRID ERCIM Working Group Workshop on Grids, P2P and Service Computing, held in conjunction with Euro-Par 2009 on August 24th, 2009 in Delft, The Netherlands. An edited volume contributed by well-established researchers worldwide, it focuses on solving research challenges for Grid and P2P technologies. Topics of interest include: Service Level Agreements, Data & Knowledge Management, Scheduling, Trust and Security, Network Monitoring and more. Grids are a crucial enabling technology for scientific and industrial development, and this book also addresses new challenges related to service-oriented infrastructures. Grids, P2P and Services Computing is designed for a professional audience of researchers and practitioners in the Grid community and industry. This volume is also suitable for advanced-level students in computer science.
What role should regulation play in financial markets, and what have been the ramifications of financial regulation? To answer these and other questions about the efficacy of legislation on financial markets, this book examines the impact of the Gramm-Leach-Bliley Act (GLBA), also called the Financial Modernization Act of 1999, which fundamentally changed the financial landscape in the United States. The GLBA allows the formation of financial holding companies that can offer an integrated set of commercial banking, securities and insurance products. The tenth anniversary of this most sweeping reform of the industry's structure is a natural benchmark for assessing the effects of the law and for asking whether changes are needed in the working of this historic legislation. The importance of this review is reinforced by a variety of proposals over the last several years to reform the regulation of financial institutions, proposals that have attracted considerable attention among regulators and in the financial firms they regulate. Most recently, the financial crisis and the failure of some large financial institutions have called into question the legitimacy of America's current financial structure and its regulation, including to some degree the GLBA. There is no doubt that regulatory reform is front and center on today's policy agenda. The lessons of the GLBA experience and its effects, both domestic and international, on financial markets and competitiveness, and on risk-taking and risk management by financial services firms and their regulators, will be critical to the direction the country takes and to the effort to ensure that future financial crises do not occur or cause less costly damage. With contributions from academics, policy experts, and a sponsor of the GLBA, Congressman James Leach, this book is invaluable to anyone interested in financial system reform.
The volume presents a selection of in-depth studies and state-of-the-art surveys of several challenging topics that are at the forefront of modern applied mathematics, mathematical modeling, and computational science. These three areas represent the foundation upon which the methodology of mathematical modeling and computational experiment is built as a ubiquitous tool in all areas of mathematical applications. This book covers both fundamental and applied research, ranging from studies of elliptic curves over finite fields with their applications to cryptography, to dynamic blocking problems, to random matrix theory with its innovative applications. The book provides the reader with state-of-the-art achievements in the development and application of new theories at the interface of applied mathematics, modeling, and computational science. This book aims at fostering interdisciplinary collaborations required to meet the modern challenges of applied mathematics, modeling, and computational science. At the same time, the contributions combine rigorous mathematical and computational procedures and examples from applications ranging from engineering to life sciences, providing a rich ground for graduate student projects.
Reaction-diffusion and excitable media are amongst the most intriguing substrates. Despite the apparent simplicity of the physical processes involved, these media exhibit a wide range of amazing patterns: from target and spiral waves to travelling localisations and stationary breathing patterns. Such media lie at the heart of many natural processes, including the morphogenesis of living beings, geological formations, nervous and muscular activity, and socio-economic developments. This book explores a minimalist paradigm for studying reaction-diffusion and excitable media using locally connected networks of finite-state machines: cellular automata and automata on proximity graphs. Cellular automata are marvellous objects per se because they show us how to generate and manage complexity using very simple rules of dynamical transitions. When combined with the reaction-diffusion paradigm, cellular automata become an essential, user-friendly tool for modelling natural systems and for designing future and emergent computing architectures. The book brings together hot topics of non-linear sciences, complexity, and future and emergent computing. It shows how to discover propagating localisations and perform computation with them in very simple two-dimensional automaton models. The paradigms, models and implementations presented in the book strengthen the theoretical foundations of future and emergent computing and lay keystones towards physically embodied information-processing systems.
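As a taste of the kind of minimalist excitable-medium model the blurb describes, the following sketch implements a classic Greenberg-Hastings-style cellular automaton with resting, excited and refractory states on a small grid. It is a generic textbook automaton rather than one of the book's specific models, and the grid size, seeding and neighbourhood choice are assumptions.

    # Greenberg-Hastings-style excitable cellular automaton (generic sketch, not the book's models).
    # States: 0 = resting, 1 = excited, 2 = refractory; von Neumann neighbourhood, periodic boundaries.
    import numpy as np

    def step(grid):
        excited_neighbour = np.zeros_like(grid, dtype=bool)
        for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
            excited_neighbour |= np.roll(grid, shift, axis=axis) == 1
        new = np.zeros_like(grid)
        new[(grid == 0) & excited_neighbour] = 1   # a resting cell fires if any neighbour is excited
        new[grid == 1] = 2                         # an excited cell becomes refractory
        new[grid == 2] = 0                         # a refractory cell recovers to resting
        return new

    if __name__ == "__main__":
        grid = np.zeros((20, 20), dtype=int)
        grid[10, 10] = 1                           # a single excited seed launches a target wave
        for _ in range(8):
            grid = step(grid)
        print(np.count_nonzero(grid == 1), "cells excited after 8 steps")

Running the loop produces an expanding ring of excitation followed by a refractory tail, the simplest example of how local transition rules generate wave patterns in excitable media.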
This book constitutes the refereed proceedings of the 13th International Conference on Entertainment Computing, ICEC 2014, held in Sydney, Australia, in 2014. The 20 full papers, 6 short papers and 8 posters presented were carefully reviewed and selected from 62 submissions. In addition to these papers, the program featured 3 demonstration papers and 2 workshops. The papers cover various aspects of entertainment computing, including the authoring, development, use and evaluation of digital entertainment artefacts and processes.
The book investigates EU preferential trade policy and, in particular, the impact it has had on trade flows from developing countries. It shows that the capability of the "trade as aid" model to deliver its expected benefits to these countries differs crucially between preferential schemes and sectors. The book takes an eclectic but rigorous approach to the econometric analysis by combining different specifications of the gravity model. An in-depth presentation of the gravity model is also included, providing significant insights into the distinctive features of this technique and its state-of-the-art implementation. The evidence produced in the book is extensively applied to the analysis of EU preferential policies, with substantial suggestions for future improvement. Additional electronic material to replicate the book's analysis (datasets and GAMS and Stata 9.0 routines) can be found in the Extra Materials menu on the book's website.
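For orientation, a common log-linear specification of the gravity equation augmented with a preferential-access dummy reads, in standard textbook notation that is not necessarily the exact specification estimated in the book:

\ln X_{ij} = \beta_0 + \beta_1 \ln Y_i + \beta_2 \ln Y_j + \beta_3 \ln D_{ij} + \beta_4 \, \mathrm{PREF}_{ij} + \varepsilon_{ij}

where X_{ij} denotes exports from country i to country j, Y_i and Y_j are the trading partners' economic sizes (typically GDP), D_{ij} is bilateral distance, and \mathrm{PREF}_{ij} is a dummy equal to one when j grants i preferential access.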
The 9th ACIS/IEEE International Conference on Computer Science and Information Science, held in Kaminoyama, Japan, on August 18-20, aimed to bring together researchers and scientists, businessmen and entrepreneurs, and teachers and students to discuss the numerous fields of computer science and to share ideas and information in a meaningful way. This publication captures 18 of the conference's most promising papers, and we eagerly await the important contributions that we know these authors will bring to the field. In chapter 1, Taewan Gu et al. propose a method of software reliability estimation based on IEEE Std. 1633 that is adaptive in the face of frequent changes to software requirements, and they use a case study to show why the adaptive approach is necessary when software requirements change frequently. In chapter 2, Keisuke Matsuno et al. investigate the capacity of incremental learning in chaotic neural networks, varying both the refractory parameter and the learning parameter with network size; their simulations find that capacity can increase more than in direct proportion to size. In chapter 3, Hongwei Zeng and Huaikou Miao extend classical labeled transition system models to make both abstraction and compositional reasoning applicable to deadlock detection for the parallel composition of components, and they propose a compositional abstraction refinement approach.
Design considerations for low-power operation and robustness with respect to variations typically impose contradictory requirements. Low-power design techniques such as voltage scaling, dual-threshold assignment and gate sizing can have a large negative impact on parametric yield under process variations. This book focuses on circuit and architectural design techniques for achieving low-power operation under parameter variations. It considers both logic and memory design aspects and covers modeling and analysis as well as design methodology to achieve simultaneously low power and variation tolerance while minimizing design overhead. The book also discusses current industrial practices and emerging challenges at future technology nodes.
This book is about privacy interests in English tort law. Despite the recent recognition of a misuse of private information tort, English law remains underdeveloped. The presence of gaps in the law can be explained, to some extent, by a failure on the part of courts and legal academics to reflect on the meaning of privacy. Through comparative, critical and historical analysis, this book seeks to refine our understanding of privacy by considering our shared experience of it. To this end, the book draws on the work of Norbert Elias and Karl Popper, among others, and compares the English law of privacy with the highly elaborate German law. In doing so, the book reaches the conclusion that an unfortunate consequence of the way English privacy law has developed is that it gives the impression that justice is only for the rich and famous. If English courts are to ensure equalitarian justice, the book argues that they must reflect on the value of privacy and explore the bounds of legal possibility.
This set of six volumes provides a systematic and standardized description of 23,033 chemical components isolated from 6,926 medicinal plants, collected from 5,535 books and articles published in Chinese and international journals. A chemical structure with stereochemistry bonds is provided for each chemical component, in addition to conventional information such as Chinese and English names and physical and chemical properties, and each entry includes a list of the medicinal plants from which the component was isolated. Furthermore, abundant pharmacological data for nearly 8,000 chemical components are presented, including experimental method, experimental animal, cell type, quantitative data, and control compound data. Seven indexes allow for complete cross-indexing: regardless of whether one searches for the molecular formula of a compound, the pharmacological activity of a compound, or the English name of a plant, the information in the book can be retrieved in multiple ways.
Mathematical epidemiology of infectious diseases usually involves describing the flow of individuals between mutually exclusive infection states. One of the key parameters describing the transition from the susceptible to the infected class is the hazard of infection, often referred to as the force of infection. The force of infection reflects the degree of contact with potential for transmission between infected and susceptible individuals. The mathematical relation between the force of infection and effective contact patterns is generally assumed to follow the mass action principle, which yields the information needed to estimate the basic reproduction number, another key parameter in infectious disease epidemiology. It is within this context that the Center for Statistics (CenStat, I-Biostat, Hasselt University) and the Centre for the Evaluation of Vaccination and the Centre for Health Economic Research and Modelling Infectious Diseases (CEV, CHERMID, Vaccine and Infectious Disease Institute, University of Antwerp) have collaborated over the past 15 years. This book demonstrates the past and current research activities of these institutes and can be considered a milestone in this collaboration. It focuses on the application of modern statistical methods and models to estimate infectious disease parameters, and it provides readers with software guidance, such as R packages, and with data, as far as these can be made publicly available.
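As a point of reference, under the simplest (frequency-dependent) mass action assumption the force of infection is proportional to the prevalence of infectious individuals, and for an SIR-type model the basic reproduction number follows from the transmission and recovery rates. The notation below is the standard textbook one, not necessarily the book's:

\lambda(t) = \beta \, \frac{I(t)}{N}, \qquad R_0 = \frac{\beta}{\gamma},

where \beta is the effective transmission rate, I(t)/N the proportion of the population that is infectious, and 1/\gamma the mean infectious period.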
This book describes the implementation of autonomous control with multiagent technology. In doing so, it tackles the challenges of supply network management caused by the complexity, dynamics, and distribution of logistics processes. The paradigm of autonomous logistics reduces computational complexity and copes with dynamics locally by delegating process control to the participating objects. For example, shipping containers may themselves plan and schedule their way through logistics networks in accordance with objectives imposed by their owners. The technologies enabling autonomous logistics are thoroughly described and reviewed. The presented solution has been applied in a realistic simulation of real-world container logistics processes. The validation shows that autonomous control is feasible and that it outperforms the previous centralised dispatching approach by significantly increasing resource utilisation efficiency. Moreover, the multiagent system relieves human dispatchers from dealing with standard cases, giving them more time to handle exceptional cases appropriately.
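To make the idea of delegated control concrete, the toy sketch below lets a container agent choose its own route leg by leg according to an owner-imposed objective (here, simply the cheapest outgoing leg). The network, costs and class names are purely illustrative assumptions and do not reproduce the system described in the book.

    # Toy illustration of autonomous logistics: a container agent plans its own route
    # according to its owner's objective. Network, costs and names are assumed for illustration.
    COST = {  # directed legs: (origin, destination) -> transport cost
        ("Hamburg", "Rotterdam"): 120, ("Hamburg", "Bremerhaven"): 80,
        ("Rotterdam", "Shanghai"): 900, ("Bremerhaven", "Shanghai"): 950,
    }

    class ContainerAgent:
        def __init__(self, location, destination):
            self.location, self.destination = location, destination
            self.route = [location]

        def plan(self):
            """Greedily follow the cheapest outgoing leg until the destination is reached."""
            while self.location != self.destination:
                legs = {d: c for (o, d), c in COST.items() if o == self.location}
                if not legs:
                    raise RuntimeError(f"no outgoing leg from {self.location}")
                self.location = min(legs, key=legs.get)
                self.route.append(self.location)
            return self.route

    if __name__ == "__main__":
        print(ContainerAgent("Hamburg", "Shanghai").plan())   # ['Hamburg', 'Bremerhaven', 'Shanghai']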
Marsupials belong to the class Mammalia and share some features with other mammals, yet they also possess many unique features. It is their differences from the more traditionally studied mammals, such as mice and humans, that are of greatest value to comparative studies. The sequencing of genomes from two distantly related marsupials, the grey short-tailed opossum from South America and the Australian tammar wallaby, has launched marsupials into the genomics era and accelerated the rate of progress in marsupial research. With the current worldwide concern for the plight of the endangered Tasmanian devil, marsupial genetics and genomics research is more important than ever if this species is to be saved from extinction. This volume recounts some of the history of research in this field and highlights the most recent advances in the many different areas of marsupial genetics and genomics research.
Evolution equations of hyperbolic or, more generally, p-evolution type form an active field of current research. This volume aims to collect recent advances in the area in order to allow a quick overview of ongoing research. The contributors are first-rate mathematicians. This collection of research papers is centred around parametrix constructions and microlocal analysis; asymptotic constructions of solutions; energy and dispersive estimates; and associated spectral transforms. Applications concerning elasticity and general relativity complement the volume. The book gives an overview of a variety of ongoing research in the field and therefore allows researchers as well as students to grasp new aspects and broaden their understanding of the area.
This book constitutes the refereed proceedings of the 11th Joint Conference on Knowledge-Based Software Engineering, JCKBSE 2014, held in Volgograd, Russia, in September 2014. The 59 full and 3 short papers presented were carefully reviewed and selected from 197 submissions. The papers are organized in topical sections on methodology and tools for knowledge discovery and data mining; methods and tools for software engineering education; knowledge technologies for semantic web and ontology engineering; knowledge-based methods and tools for testing, verification and validation, maintenance and evolution; natural language processing, image analysis and recognition; knowledge-based methods and applications in information security, robotics and navigation; decision support methods for software engineering; architecture of knowledge-based systems, including intelligent agents and softbots; automating software design and synthesis; knowledge management for business processes, workflows and enterprise modeling; knowledge-based methods and applications in bioscience, medicine and justice; knowledge-based requirements engineering, domain analysis and modeling; intelligent user interfaces and human-machine interaction; lean software engineering; and program understanding, programming knowledge, modeling programs and programmers.
The seven-volume set comprising LNCS volumes 8689-8695 constitutes the refereed proceedings of the 13th European Conference on Computer Vision, ECCV 2014, held in Zurich, Switzerland, in September 2014. The 363 revised papers presented were carefully reviewed and selected from 1444 submissions. The papers are organized in topical sections on tracking and activity recognition; recognition; learning and inference; structure from motion and feature matching; computational photography and low-level vision; vision; segmentation and saliency; context and 3D scenes; motion and 3D scene analysis; and poster sessions.
Chemistry for Sustainable Development in Africa gives an insight into current chemical research in Africa. It is edited and written by distinguished African scientists and includes contributions from chemists from Northern, Southern, Western, Eastern, Central and island-state African countries. The core themes embrace the most pressing issues of our time, including environmental chemistry, renewable energies, health and human well-being, food and nutrition, and bioprospecting and commercial development. This book is invaluable for teaching and research institutes in Africa and worldwide, for private-sector entities dealing with natural products from Africa, and for policy- and decision-making bodies and non-governmental organizations.
This book analyzes the latest advances in privacy, security and risk technologies within cloud environments. With contributions from leading experts, the text presents both a solid overview of the field and novel, cutting-edge research. A Glossary is also included at the end of the book. Topics and features: considers the various forensic challenges for legal access to data in a cloud computing environment; discusses privacy impact assessments for the cloud, and examines the use of cloud audits to attenuate cloud security problems; reviews conceptual issues, basic requirements and practical suggestions for provisioning dynamically configured access control services in the cloud; proposes scoped invariants as a primitive for analyzing a cloud server for its integrity properties; investigates the applicability of existing controls for mitigating information security risks to cloud computing environments; describes risk management for cloud computing from an enterprise perspective.
Rightshore (R) - a registered trademark of Capgemini - is about organizing the distributed delivery process that embraces on-site, nearshore and offshore services. This book describes successful global delivery models utilizing industrialized methods to deliver SAP (R) projects from India. The first part is devoted to management concepts, service offerings and the peculiarities of working together with India. The second part features eight case studies from different industries and from around the world describing how India delivery centers have been successfully deployed in SAP (R) development projects.
Computational intelligence based techniques have firmly established themselves as viable alternative mathematical tools for more than a decade. They have been extensively employed in many systems and application domains, among them signal processing, automatic control, industrial and consumer electronics, robotics, finance, manufacturing systems, electric power systems, and power electronics. Image processing is also an extremely potent area which has attracted the attention of many researchers who are interested in the development of new computational intelligence-based techniques and their suitable applications, in both research problems and real-world problems. Part I of the book discusses several image preprocessing algorithms; Part II broadly covers image compression algorithms; Part III demonstrates how computational intelligence-based techniques can be effectively utilized for image analysis purposes; and Part IV shows how pattern recognition, classification and clustering-based techniques can be developed for the purpose of image inferencing. The book offers a unified view of the modern computational intelligence techniques required to solve real-world problems and is suitable as a reference for engineers, researchers and graduate students.
This Brief discusses methods to develop and maintain police-researcher partnerships. First, the authors provide information that will be useful to police managers and researchers who are interested in creating and maintaining partnerships to conduct research, work together to improve policing, and help others understand the linkages between the two groups. Then, more specifically, they describe how police managers consider and utilize research in policing and criminal justice, and its findings, from a management perspective in both the United States and Australia. While both countries experience similar issues of trust, acceptance, utility, and accountability between researchers and practitioners, their experiences differ. In the United States, with 17,000 agencies, the use of research findings by police agencies requires understanding, diffusion and acceptance. In Australia, with a small number of larger agencies, research-practitioner partnerships face different translational issues, including acceptance and application. As long as police practitioners and academic researchers hold distinct and different impressions of each other, the likelihood of positive, cooperative, and sustainable agreements between them will suffer.
As the world's economy develops into a more dynamic, fast-moving, and unpredictable entity, it is crucial that the workers who create wealth have the ability to assess and respond to new and unforeseen challenges. In other words, the future will require a more competent workforce. What, though, does this mean in practice? In this, the fully revised second edition of Christine Velde's book, a variety of researchers from around the world provide a truly international perspective on the issue. They help to redefine the term competence. Rather than responding to challenges using a pre-existing set of skills, they see competence as the ability to assess new situations and then adapt one's response accordingly, particularly in collaboration with others. Providing the reader with insightful perspectives about competence in different situations and contexts, the book's sections explore the concept of competence in industry and vocational education, in schools and colleges, in small businesses and companies, and in universities. The interpretation, experience and teaching of competence in the workplace are boiled down to five essential components that together represent an argument for a more holistic conception of competence. Velde herself concludes the book by synthesizing and reflecting on its contents. The book provides the reader with insightful perspectives on competence and on the characteristics of learning environments in different workplace contexts. Drawing on phenomenographic insights allows it to present a more enlightened view of competence, while opening up an international dialogue about the meaning and interpretation of competence in the workplace. Useful not only to educators and researchers, this volume will also assist leaders and managers in a variety of contexts to develop more meaningful workplaces.
Children's Fractional Knowledge elegantly tracks the construction of knowledge, both by children learning new methods of reasoning and by the researchers studying their methods. The book challenges the widely held belief that children's whole number knowledge is a distraction from their learning of fractions by positing that their fractional learning involves reorganizing, not simply using or building upon, their whole number knowledge. This hypothesis is explained in detail using examples of actual grade-schoolers approaching problems in fractions, including the schemes they construct to relate parts to a whole, to produce a fraction as a multiple of a unit part, to transform a fraction into a commensurate fraction, or to combine two fractions multiplicatively or additively. These case studies provide a singular journey into children's mathematics experience, which often varies greatly from that of adults. Moreover, the authors' descriptive terms reflect children's quantitative operations, as opposed to adult mathematical phrases rooted in concepts that do not reflect, and which in the classroom may even suppress, youngsters' learning experiences.
Highlights of the coverage:
Toward a formulation of a mathematics of living instead of being
Operations that produce numerical counting schemes
Case studies: children's part-whole, partitive, iterative, and other fraction schemes
Using the generalized number sequence to produce fraction schemes
Redefining school mathematics
This fresh perspective is of immediate importance to researchers in mathematics education. With the up-close lens onto mathematical development found in Children's Fractional Knowledge, readers can work toward creating more effective methods for improving young learners' quantitative reasoning skills.