This book features selected papers presented at the 2nd International Conference on Advanced Computing Technologies and Applications, held at SVKM's Dwarkadas J. Sanghvi College of Engineering, Mumbai, India, from 28 to 29 February 2020. Covering recent advances in next-generation computing, the book focuses on recent developments in intelligent computing, such as linguistic computing, statistical computing, data computing and ambient applications.
Technological progress has allowed us to develop devices that make it possible to record blood pressure and heart rate under real living conditions. Furthermore, technological progress has provided computer tools to analyse these data on a beat-to-beat basis, offering a large body of information on the complex patterns of blood pressure and heart rate variability in health and in disease, which has clarified a number of mechanisms responsible for cardiovascular regulation during the day and night. This book aims at providing an updated state of the art on the analysis of blood pressure and heart rate variability from the different perspectives of physiologists, clinicians and engineers by gathering contributions from experts in both the biomedical and the engineering fields. In particular, attention is focused on 1) the methodology of ambulatory blood pressure and heart rate analyses, 2) the mathematical models that can be derived from the data and the implications they have for the understanding of normal and deranged cardiovascular control, and 3) the present and future use of computer analysis of dynamically collected observations for the diagnosis of cardiovascular diseases.
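The beat-to-beat analysis referred to above is commonly summarised with time-domain variability indices. The following is a minimal sketch, assuming the standard definitions of SDNN and RMSSD and a hypothetical series of RR intervals (not data from the book):

```python
import numpy as np

def time_domain_hrv(rr_ms):
    """Two standard time-domain variability indices computed from
    beat-to-beat (RR) intervals given in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = rr.std(ddof=1)                 # overall variability
    diffs = np.diff(rr)
    rmssd = np.sqrt(np.mean(diffs ** 2))  # short-term, beat-to-beat variability
    return sdnn, rmssd

# Illustrative RR series (ms); real series come from ambulatory recordings.
rr_intervals = [812, 790, 805, 821, 798, 830, 815, 801]
sdnn, rmssd = time_domain_hrv(rr_intervals)
print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```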
This thesis deals with topological orders from two different perspectives: from a condensed matter point of view, where topological orders are considered breakthrough phases of matter, and from the emerging realm of quantum computation, where topological quantum codes are considered the most appealing platform against decoherence. The thesis reports remarkable studies from both sides. It thoroughly investigates a topological order called the double semion model, a counterpart of the Kitaev model but exhibiting richer quasiparticles as excitations. A new model for symmetry-enriched topological order is constructed, which adds an onsite global symmetry to the double semion model. Using this topological phase, a new example of a topological code is developed, the semion code, which is non-CSS, additive, non-Pauli and within the stabiliser formalism. Furthermore, the thesis analyses the Rashba spin-orbit coupling within topological insulators, turning the helical edge states into generic edge modes with potential applications in spintronics. New types of topological superconductors are proposed and the novel properties of the correspondingly created Majorana fermions are investigated. These Majorana fermions have inherent properties enabling braiding and the performance of logical gates as fundamental blocks for a universal quantum computer.
System-on-chip (SoC) designs have evolved from fairly simple single-core, single-memory designs to complex heterogeneous multicore SoC architectures consisting of a large number of IP blocks on the same silicon. To meet the high computational demands posed by the latest consumer electronic devices, most current systems are based on this paradigm, which represents a real revolution in many aspects of computing. The attraction of multicore processing for power reduction is compelling. By splitting a set of tasks among multiple processor cores, the operating frequency necessary for each core can be reduced, which in turn allows the voltage on each core to be reduced. Because dynamic power is proportional to the frequency and to the square of the voltage, the result is a large power saving, even though more cores may be running. As more and more cores are integrated into these designs to share the ever increasing processing load, the main challenges lie in an efficient memory hierarchy, a scalable system interconnect, new programming paradigms, and an efficient integration methodology for connecting such heterogeneous cores into a single system capable of leveraging their individual flexibility. Current design methods tend toward mixed HW/SW co-designs targeting multicore systems-on-chip for specific applications. To decide on the lowest-cost mix of cores, designers must iteratively map the device's functionality to a particular HW/SW partition and target architectures. In addition, to connect the heterogeneous cores, the architecture requires high-performance, complex communication architectures and efficient communication protocols, such as a hierarchical bus, point-to-point connections, or a Network-on-Chip. Software development also becomes far more complex due to the difficulties in breaking a single processing task into multiple parts that can be processed separately and then reassembled later. This reflects the fact that certain processor jobs cannot be easily parallelized to run concurrently on multiple processing cores and that load balancing between processing cores - especially heterogeneous cores - is very difficult.
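The power argument above rests on the rough scaling of dynamic (switching) power as P ~ C * V^2 * f. A minimal sketch with purely illustrative capacitance, voltage and frequency values (none of them from the book), comparing one fast core against four slower, lower-voltage cores:

```python
def dynamic_power(c_eff, v, f):
    """Dynamic (switching) power of a CMOS core: P ~ C_eff * V^2 * f."""
    return c_eff * v ** 2 * f

# Illustrative numbers only: one core at full speed vs. four cores each at a
# quarter of the frequency and a lower supply voltage.
C_EFF = 1.0e-9  # effective switched capacitance (F), assumed
single = dynamic_power(C_EFF, v=1.2, f=2.0e9)
quad = 4 * dynamic_power(C_EFF, v=0.9, f=0.5e9)

print(f"single core: {single:.2f} W")
print(f"four cores : {quad:.2f} W ({quad / single:.0%} of the single-core power)")
```

Even though four cores are switching, the quadratic dependence on voltage dominates, so the total dynamic power drops.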
This book offers an in-depth insight into the general-purpose finite element program MSC Marc, which is distributed by MSC Software Corporation. It is a specialized program for nonlinear problems (implicit solver) that is common in academia and industry. The primary goal of this book is to provide a comprehensive introduction to a special feature of this software: the user can write user subroutines in the programming language Fortran, the language of all classical finite element packages. This subroutine feature allows the user to replace certain modules of the core code and to implement new features such as constitutive laws or new elements. Thus, the functionality of commercial codes ('black box') can easily be extended by linking user-written code to the main core of the program. This feature lets the user combine the advantages of a commercial software package with the flexibility of a 'semi-open' code.
This book focuses on protocols and constructions that make good use of the building blocks for symmetric cryptography. The book brings under one roof several esoteric strategies for utilizing symmetric cryptographic blocks. The specific topics addressed by the book include various key distribution strategies for unicast, broadcast and multicast security, and strategies for constructing efficient digests of dynamic databases using binary hash trees.
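Binary hash trees of the kind mentioned above are commonly used as compact, updatable digests. The following is a minimal sketch of that general idea (not the book's specific construction), assuming SHA-256 from Python's standard hashlib and a hypothetical list of records:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root digest of a binary hash tree over the given records (bytes).
    Changing any record changes the root, so the root is a compact digest
    of the whole database."""
    level = [h(leaf) for leaf in leaves] or [h(b"")]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

records = [b"alice:100", b"bob:250", b"carol:75", b"dave:30"]
print("digest before update:", merkle_root(records).hex()[:16])
records[1] = b"bob:260"  # a dynamic update to a single record
print("digest after update: ", merkle_root(records).hex()[:16])
```

After a single-record update, only the hashes on that record's path to the root actually need recomputation, which is what makes such digests efficient for dynamic databases.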
"Managing Data in Motion" describes techniques that have been developed for significantly reducing the complexity of managing system interfaces and enabling scalable architectures. Author April Reeve brings over two decades of experience to present a vendor-neutral approach to moving data between computing environments and systems. Readers will learn the techniques, technologies, and best practices for managing the passage of data between computer systems and integrating disparate data together in an enterprise environment. The average enterprise's computing environment is comprised of hundreds to thousands computer systems that have been built, purchased, and acquired over time. The data from these various systems needs to be integrated for reporting and analysis, shared for business transaction processing, and converted from one format to another when old systems are replaced and new systems are acquired. The management of the "data in motion" in organizations is
rapidly becoming one of the biggest concerns for business and IT
management. Data warehousing and conversion, real-time data
integration, and cloud and "big data" applications are just a few
of the challenges facing organizations and businesses today.
"Managing Data in Motion" tackles these and other topics in a style
easily understood by business and IT managers as well as
programmers and architects.
This book contains an edited selection of the papers accepted for presentation and discussion at the first International Symposium on Qualitative Research (ISQR2016), held in Porto, Portugal, July 12th-14th, 2016. The book and the symposium feature four main application fields (Education, Health, Social Sciences, and Engineering and Technology) and seven main subjects: Rationale and Paradigms of Qualitative Research (theoretical studies, critical reflection about epistemological, ontological and axiological dimensions); Systematization of Approaches with Qualitative Studies (literature review, integrating results, aggregation studies, meta-analysis, meta-analysis of qualitative meta-synthesis, meta-ethnography); Qualitative and Mixed Methods Research (emphasis on research processes that build on mixed methodologies but with priority to qualitative approaches); Data Analysis Types (content analysis, discourse analysis, thematic analysis, narrative analysis, etc.); Innovative Processes of Qualitative Data Analysis (design analysis, articulation and triangulation of different sources of data - images, audio, video); Qualitative Research in Web Context (eResearch, virtual ethnography, interaction analysis, latent corpus on the internet, etc.); and Qualitative Analysis with Support of Specific Software (usability studies, user experience, the impact of software on the quality of research).
In a fragment entitled Elementa Nova Matheseos Universalis (1683?) Leibniz writes "the mathesis [...] shall deliver the method through which things that are conceivable can be exactly determined"; in another fragment he takes the mathesis to be "the science of all things that are conceivable." Leibniz considers all mathematical disciplines as branches of the mathesis and conceives the mathesis as a general science of forms applicable not only to magnitudes but to every object that exists in our imagination, i.e. that is possible at least in principle. As a general science of forms, the mathesis investigates possible relations between "arbitrary objects" ("objets quelconques"). It is an abstract theory of combinations and relations among objects whatsoever. In 1810 the mathematician and philosopher Bernard Bolzano published a booklet entitled Contributions to a Better-Grounded Presentation of Mathematics. There is, according to him, a certain objective connection among the truths that are germane to a certain homogeneous field of objects: some truths are the "reasons" ("Gründe") of others, and the latter are "consequences" ("Folgen") of the former. The reason-consequence relation seems to be the counterpart of causality at the level of a relation between true propositions. A rigorous proof is characterized in this context as a proof that shows the reason of the proposition that is to be proven. Requirements imposed on rigorous proofs seem to anticipate normalization results in current proof theory. The contributors to Mathesis Universalis, Computability and Proof, leading experts in the fields of computer science, mathematics, logic and philosophy, show the evolution of these and related ideas, exploring topics in proof theory, computability theory, intuitionistic logic, constructivism and reverse mathematics, and delving deeply into a contextual examination of the relationship between mathematical rigor and demands for simplification.
This book offers an introduction to applications prompted by tensor analysis, especially by the spectral tensor theory developed in recent years. It covers applications of tensor eigenvalues in multilinear systems, exponential data fitting, tensor complementarity problems, and tensor eigenvalue complementarity problems. It also addresses higher-order diffusion tensor imaging, third-order symmetric and traceless tensors in liquid crystals, piezoelectric tensors, strong ellipticity for elasticity tensors, and higher-order tensors in quantum physics. This book is a valuable reference resource for researchers and graduate students who are interested in applications of tensor eigenvalues.
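The spectral tensor theory mentioned above centres on tensor eigenvalues. As a concrete illustration, here is a minimal sketch of computing one Z-eigenpair of a small symmetric third-order tensor with a shifted higher-order power iteration - a commonly used approach, not necessarily one covered in the book; the example tensor and the shift are arbitrary:

```python
from itertools import permutations

import numpy as np

def z_eigenpair(A, alpha, iters=2000, tol=1e-12, seed=0):
    """Shifted symmetric higher-order power iteration for one Z-eigenpair of a
    symmetric 3rd-order tensor: A x^2 = lambda * x with ||x|| = 1."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        x_new = np.einsum('ijk,j,k->i', A, x, x) + alpha * x  # shift stabilises the iterates
        x_new /= np.linalg.norm(x_new)
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    lam = np.einsum('ijk,i,j,k->', A, x, x, x)  # lambda = A x^3
    return lam, x

# Small symmetric example tensor: symmetrize a random 3x3x3 array.
rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3, 3))
A = sum(T.transpose(p) for p in permutations(range(3))) / 6.0

lam, x = z_eigenpair(A, alpha=2.0 * np.linalg.norm(A))  # generous shift for convergence
residual = np.linalg.norm(np.einsum('ijk,j,k->i', A, x, x) - lam * x)
print(f"lambda = {lam:.6f}, residual = {residual:.2e}")
```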
Since its first volume in 1960, Advances in Computers has
presented detailed coverage of innovations in computer hardware,
software, theory, design, and applications. It has also provided
contributors with a medium in which they can explore their subjects
in greater depth and breadth than journal articles usually allow.
As a result, many articles have become standard references that
continue to be of significant, lasting value in this rapidly
expanding field.
This book investigates the coordinated power management of multi-tenant data centers, which account for a large portion of the data center industry. The authors discuss these data centers' rapid growth and their electricity consumption, which has huge economic and environmental impacts. This book covers the various coordinated management solutions in the existing literature, focusing on efficiency, sustainability, and demand response aspects. First, the authors provide a background on the multi-tenant data center covering the stakeholders, components, power infrastructure, and energy usage. Then, each power management mechanism is described in terms of motivation, problem formulation, challenges, and solutions.
As technology becomes further meshed into our culture and everyday lives, new mediums and outlets for creative expression and innovation are necessary. The Handbook of Research on Computational Arts and Creative Informatics covers a comprehensive range of topics regarding the interaction of the sciences and the arts. Exploring new uses of technology and investigating creative insights into concepts of art and expression, this cutting-edge Handbook of Research offers a valuable resource to academicians, researchers, and field practitioners.
This book presents the latest findings and ongoing research in the field of environmental informatics. It addresses a wide range of cross-cutting activities, such as efficient computing, virtual reality, disruption management, big data, open science and the internet of things, and showcases how these green information and communication technologies (ICT) can be used to effectively address environmental and societal challenges. Presenting a selection of extended contributions to the 32nd edition of the International Conference EnviroInfo 2018, held at the Leibniz Supercomputing Centre in Garching near Munich, it is essential reading for anyone looking to expand their expertise in the area.
This Festschrift is in honor of Marilyn Wolf, on the occasion of her 60th birthday. Prof. Wolf is a renowned researcher and educator in Electrical and Computer Engineering, who has made pioneering contributions in all of the major areas of Embedded, Cyber-Physical, and Internet of Things (IoT) Systems. This book provides a timely collection of contributions that cover important topics related to Smart Cameras, Hardware/Software Co-Design, and Multimedia applications. Embedded systems are everywhere; cyber-physical systems enable monitoring and control of complex physical processes with computers; and IoT technology is of increasing relevance in major application areas, including factory automation and smart cities. Smart cameras and multimedia technologies introduce novel opportunities and challenges in embedded, cyber-physical and IoT applications. Advanced hardware/software co-design methodologies provide valuable concepts and tools for addressing these challenges. The diverse topics of the chapters in this Festschrift reflect the great breadth and depth of Marilyn Wolf's contributions in research and education. The chapters have been written by some of Marilyn's closest collaborators and colleagues.
This book primarily addresses Intelligent Information Systems (IIS) and the integration of artificial intelligence, intelligent systems and technologies, database technologies and information systems methodologies to create the next generation of information systems. It includes original and state-of-the-art research on theoretical and practical advances in IIS, system architectures, tools and techniques, as well as "success stories" in intelligent information systems. Intended as an interdisciplinary forum in which scientists and professionals can share their research results and report on new developments and advances in intelligent information systems, technologies and related areas, as well as their applications, it offers a valuable resource for researchers and practitioners alike.
Remote/WebCam Notarization hopes to inform, educate, and spark the curiosity of Commissioned Notaries Public and people interested in becoming a Commissioned Notary Public in the United States of America. Its basic understanding takes a holistic approach. When you read this book, let me know: find me on Facebook at #MyVirginiaNotary, and also join https://www.facebook.com/groups/RemoteWebCamNotarization. Other books by the author: "Notary Public Essentials" and "From Military to Civilian: Transitioning from War and Guns to Online Notarization Specialist."
The papers in this volume represent a broad, applied swath of advanced contributions to the 2015 ICSA/Graybill Applied Statistics Symposium of the International Chinese Statistical Association, held at Colorado State University in Fort Collins. The contributions cover topics that range from statistical applications in business and finance to applications in clinical trials and biomarker analysis. Each paper was peer-reviewed by at least two referees and also by an editor. The conference was attended by over 400 participants from academia, industry, and government agencies around the world, including from North America, Asia, and Europe.
This edited book presents scientific results of the 15th IEEE/ACIS International Conference on Computer and Information Science (ICIS 2016), which was held on June 26-29, 2016 in Okayama, Japan. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences and exchange new ideas and information in a meaningful way, to present research results on all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the best papers from those accepted for presentation at the conference. The papers were chosen based on review scores submitted by members of the program committee and underwent further rigorous rounds of review. This publication captures 12 of the conference's most promising papers, and we eagerly await the important contributions that we know these authors will bring to the field of computer and information science.
This book provides a self-study program on how mathematics, computer science and science can be usefully and seamlessly intertwined. Learning to use ideas from mathematics and computation is essential for understanding approaches to cognitive and biological science. As such, the book covers calculus of one and two variables and works through a number of interesting first-order ODE models. It uses MATLAB in computational exercises where the models cannot be solved by hand, and also helps readers to understand that approximations cause errors - a fact that must always be kept in mind.
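To illustrate the kind of exercise described above - solving a first-order ODE numerically and observing the approximation error - here is a minimal Python sketch (the book itself uses MATLAB); the model dy/dt = -k*y and its parameter values are arbitrary, and scipy is assumed to be available:

```python
import numpy as np
from scipy.integrate import solve_ivp

# First-order model dy/dt = -k*y with exact solution y(t) = y0 * exp(-k*t).
k, y0 = 1.5, 2.0
rhs = lambda t, y: -k * y

t_eval = np.linspace(0.0, 4.0, 9)
sol = solve_ivp(rhs, (0.0, 4.0), [y0], t_eval=t_eval, rtol=1e-6, atol=1e-9)

exact = y0 * np.exp(-k * t_eval)
print("max absolute error:", np.abs(sol.y[0] - exact).max())  # small, never exactly zero
```

The numerical solution tracks the exact one closely, but the error is never exactly zero, which is precisely the point the book stresses about approximations.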
A comprehensive one-year graduate (or advanced undergraduate)
course in mathematical logic and foundations of mathematics. No
previous knowledge of logic is required; the book is suitable for
self-study. Many exercises (with hints) are included.
This book presents modern functional analysis methods for the sensitivity analysis of some infinite-dimensional systems governed by partial differential equations. The main topics are treated in a general and systematic way. They include many classical applications such as the Signorini problem, the elastic-plastic torsion problem and the visco-elastic-plastic problem. The "material derivative" from which any kind of shape derivative of a cost functional can be derived is defined. New results about the wave equation and the unilateral problem are also included in this book, which is intended to serve as a basic reference work for the algorithmic approach to shape optimization problems.
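For readers unfamiliar with the terminology, the following is one standard convention for the shape derivative of a cost functional and the material derivative of the state (the book's notation may differ): the domain is perturbed along a velocity field V, and derivatives are taken with respect to the perturbation parameter t.

```latex
% Shape derivative of a cost functional J under the perturbation
% \Omega_t = (\mathrm{Id} + tV)(\Omega):
\[
  dJ(\Omega; V) \;=\; \lim_{t \to 0^{+}} \frac{J(\Omega_t) - J(\Omega)}{t}.
\]
% Material derivative of the state u_t (defined on \Omega_t), obtained by
% pulling u_t back to the reference domain \Omega before differentiating:
\[
  \dot{u}(\Omega; V) \;=\; \lim_{t \to 0^{+}}
  \frac{u_t \circ (\mathrm{Id} + tV) - u}{t}.
\]
```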
Develop your own trading system with practical guidance and expert advice. In Building Algorithmic Trading Systems: A Trader's Journey From Data Mining to Monte Carlo Simulation to Live Trading, award-winning trader Kevin Davey shares his secrets for developing trading systems that generate triple-digit returns. With both explanation and demonstration, Davey guides you step-by-step through the entire process of generating and validating an idea, setting entry and exit points, testing systems, and implementing them in live trading. You'll find concrete rules for increasing or decreasing allocation to a system, and rules for when to abandon one. The companion website includes Davey's own Monte Carlo simulator and other tools that will enable you to automate and test your own trading ideas. A purely discretionary approach to trading generally breaks down over the long haul. With market data and statistics easily available, traders are increasingly opting to employ an automated or algorithmic trading system - so much so that algorithmic trades now account for the bulk of stock trading volume. Building Algorithmic Trading Systems teaches you how to develop your own systems with an eye toward market fluctuations and the impermanence of even the most effective algorithm. * Learn the systems that generated triple-digit returns in the World Cup Trading Championship * Develop an algorithmic approach for any trading idea using off-the-shelf software or popular platforms * Test your new system using historical and current market data * Mine market data for statistical tendencies that may form the basis of a new system. Market patterns change, and so do system results. Past performance isn't a guarantee of future success, so the key is to continually develop new systems and adjust established systems in response to evolving statistical tendencies. For individual traders looking for the next leap forward, Building Algorithmic Trading Systems provides expert guidance and practical advice.
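Monte Carlo simulators of the kind mentioned above are typically used to stress-test a system's trade history. Here is a minimal sketch of that general idea, not Davey's actual tool: it resamples a hypothetical list of per-trade returns to estimate how ending equity and maximum drawdown might vary if the same trades arrived in a different order and mix (numpy is assumed; all numbers are made up):

```python
import numpy as np

def monte_carlo_equity(trade_returns, n_runs=10_000, start_equity=10_000.0, seed=42):
    """Resample per-trade returns (with replacement) and report the spread of
    ending equity plus a high-percentile maximum drawdown. Illustrative only."""
    rng = np.random.default_rng(seed)
    trades = np.asarray(trade_returns, dtype=float)
    endings, drawdowns = [], []
    for _ in range(n_runs):
        sample = rng.choice(trades, size=trades.size, replace=True)
        equity = start_equity * np.cumprod(1.0 + sample)
        peak = np.maximum.accumulate(equity)
        drawdowns.append(((peak - equity) / peak).max())
        endings.append(equity[-1])
    return np.percentile(endings, [5, 50, 95]), np.percentile(drawdowns, 95)

# Hypothetical per-trade returns from a backtest (fractions, not percent).
returns = [0.02, -0.01, 0.015, -0.03, 0.04, 0.01, -0.02, 0.03, -0.005, 0.025]
(eq5, eq50, eq95), dd95 = monte_carlo_equity(returns)
print(f"ending equity 5%/50%/95%: {eq5:,.0f} / {eq50:,.0f} / {eq95:,.0f}")
print(f"95th-percentile max drawdown: {dd95:.1%}")
```

The spread between the percentiles is the practical takeaway: a single backtested equity curve is only one of many paths the same system could have produced.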
This book addresses the question of how to achieve social coordination in Socio-Cognitive Technical Systems (SCTS). SCTS are a class of Socio-Technical Systems: complex, open systems where several humans and digital entities interact in order to achieve some collective endeavour. The book approaches the question from the conceptual background of regulated open multiagent systems, with the question being motivated by their design and construction requirements. The book captures the collective effort of eight groups from leading research centres and universities, each of which has developed a conceptual framework for the design of regulated multiagent systems, and most of which have also developed technological artefacts that support the processes from specification to implementation of such systems. The first, introductory part of the book describes the challenge of developing frameworks for SCTS and articulates the premises and the main concepts involved in those frameworks. The second part discusses the eight frameworks and contrasts their main components. The final part maps the new field by discussing the types of activities in which SCTS are likely to be used, the features that such uses will exhibit, and the challenges that will drive the evolution of this field.