This book provides a broad overview of the many card systems and solutions in practical use today. This new edition adds content on RFIDs, embedded security, attacks and countermeasures, security evaluation, Java Cards, banking and payment cards, identity cards and passports, mobile systems security, and security management. A step-by-step approach educates the reader in card types, production, operating systems, commercial applications, new technologies, security design, attacks, application development, deployment and lifecycle management. By the end of the book the reader should be able to play an educated role in a smart card related project, even to the point of programming a card application. This book is designed as a textbook for graduate-level students in computer science and also serves as an invaluable post-graduate reference for professionals and researchers. This volume offers insight into the benefits and pitfalls of diverse industry, government, financial and logistics aspects while providing a sufficient level of technical detail to support technologists, information security specialists, engineers and researchers.
E-Business covers a broad spectrum of businesses based on the Internet, including e-commerce, e-healthcare, e-government and e-tailing. While substantial attention is being given to the planning and development of e-business applications, the efficiency and effectiveness of e-business systems will largely depend on management solutions. These management solutions demand a good grasp of both the technical and business perspectives of an e-business service. There have been many books on Internet-based e-commerce, Internet protocols, distributed components and so on. However, none of these books address the problem of managing e-business as a set of networked services, and they do not link enterprise management with network and systems management. This book provides an overview of the emerging techniques for IT service management from a business perspective, with case studies from the telecommunication and healthcare sectors. It integrates the business perspective with relevant technical standards, such as SNMP, WBEM and DMI, and presents concepts and methodologies that enable the development of effective and efficient management systems for networked services. The book is intended to familiarize practicing managers, engineers, and graduate-level students with networked service management concepts, architectures and methodologies with reference to evolving standards. It should be useful in a number of disciplines, such as business management, information systems, computers and networking, and telecommunications. Appendix 2 is based on the TeleManagement (TM) Forum's documents on TOM (GB921, GB910 and GB908). While this appendix explains the basic management concepts of an e-telco, the TM Forum now recommends the use of eTOM, as explained at www.tmforum.com. An overview of eTOM is available in the report "The TeleManagement Forum's enhanced Telecom Operations Map (eTOM)" by Michael Kelly, which appeared in the Journal of Network and Systems Management in March 2003.
Business processes are becoming increasingly complex and dynamic as they seek to cope with a wide range of internal and external interactions and changes. "The Handbook of Research on Complex Dynamic Process Management: Techniques for Adaptability in Turbulent Environments" investigates the nature and history of dynamic processes, essential to understanding the need for flexibility and adaptability as well as the requirements for improved solutions. This innovative collection covers the development of various strategies, architectures, and techniques for achieving adaptive processes in turbulent environments.
An introduction to parallel programming with Open MPI using C. It is written so that someone with even a basic understanding of programming can begin to write MPI-based parallel programs.
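As an indication of the style of program such an introduction leads to, the following is a minimal sketch of an MPI "hello world" in C. It is a generic example, not taken from the book: each process reports its rank within the communicator.

```c
/* Minimal MPI example in C (illustrative sketch, not taken from the book).
 * Build: mpicc hello_mpi.c -o hello_mpi
 * Run:   mpirun -np 4 ./hello_mpi                                         */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);                /* start the MPI runtime        */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* id of this process           */
    MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes    */
    printf("Hello from rank %d of %d\n", rank, size);
    MPI_Finalize();                        /* shut down the MPI runtime    */
    return 0;
}
```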
This book discusses the elementary ideas and tools needed for open quantum systems in a comprehensive manner. Emphasis is given to both the traditional master equation and the functional (path) integral approaches. It discusses the basic paradigms of open systems, the harmonic oscillator and the two-level system, in detail. The traditional topics of dissipation and tunneling, as well as the modern field of quantum information, find a prominent place in the book. Assuming a basic background in quantum and statistical mechanics, this book will help readers familiarize themselves with the basic tools of open quantum systems. Open quantum systems is the study of the quantum dynamics of a system of interest, taking into account the effects of the ambient environment. It is ubiquitous in the sense that any system can be envisaged as surrounded by an environment that naturally exerts its influence on it. The open-systems view allows for a systematic understanding of irreversible processes such as decoherence and dissipation, which are of the essence for a correct understanding of realistic quantum dynamics and for possible implementations, and which would be essential for the development of quantum technologies.
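For orientation, the "traditional master equation" mentioned above typically refers to an equation of the Lindblad (GKSL) form, reproduced here in standard textbook notation; the book's own conventions may differ.

```latex
% Lindblad (GKSL) form of the quantum master equation (standard notation;
% the book's conventions may differ). \rho is the reduced density operator
% of the system, H its Hamiltonian, and L_k the environment-induced
% jump (Lindblad) operators.
\[
  \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho]
    + \sum_k \left( L_k \rho L_k^{\dagger}
      - \tfrac{1}{2}\,\{ L_k^{\dagger} L_k,\ \rho \} \right)
\]
```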
The book is centered around the research areas of combinatorics, special functions, and computer algebra. What these research fields share is that many of their outstanding results not only have applications in mathematics but also in other disciplines, such as computer science, physics, and chemistry. A particular charm of these areas is how they interact and influence one another. For instance, combinatorial and special-function techniques have motivated the development of new symbolic algorithms, and first proofs of challenging problems in combinatorics and special functions have been derived by making essential use of computer algebra. This book addresses these interdisciplinary aspects. Algorithmic aspects are emphasized and the corresponding software packages for concrete problem solving are introduced. Readers range from graduate students and researchers to practitioners who are interested in solving concrete problems within mathematics and other research disciplines.
Recent improvements in healthcare delivery due to innovative technological advancements have redefined fields of biomedical science, now allowing for enhanced information management, resource allocation, and quality assurance. Biocomputation and Biomedical Informatics: Case Studies and Applications provides a compendium of terms, definitions, and explanations of concepts, processes, and acronyms in this significant medical field of study. Featuring chapters authored by leading international experts, this unsurpassed collection provides a defining body of research indispensable to medical libraries, researchers, and institutions worldwide.
Algorithms are a central topic in any computer science curriculum. To distinguish this textbook from others, the author considers probabilistic methods to be fundamental for the construction of simple and efficient algorithms, and in each chapter at least one problem is solved using a randomized algorithm. Data structures are discussed to the extent needed for the implementation of the algorithms. The specific algorithms examined were chosen because of their wide field of application. The book originates from lectures for undergraduate and graduate students. The text assumes experience in programming algorithms, especially with elementary data structures such as linked lists, queues, and stacks. It also assumes familiarity with mathematical methods, although the author summarizes some basic notations and results from probability theory and related mathematical terminology in the appendices. He includes many examples to explain the individual steps of the algorithms, and he concludes each chapter with numerous exercises.
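To give a flavor of the kind of randomized algorithm the description refers to, here is a generic sketch in C (not an example from the book) of quicksort with a randomly chosen pivot, which achieves O(n log n) expected running time regardless of the input order.

```c
/* Illustrative sketch (not from the book): randomized quicksort in C.
 * A random pivot gives O(n log n) expected time on any input.           */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

static void rquicksort(int *a, int lo, int hi) {
    if (lo >= hi) return;
    swap(&a[hi], &a[lo + rand() % (hi - lo + 1)]); /* move random pivot to end */
    int p = a[hi], i = lo;
    for (int j = lo; j < hi; j++)                  /* Lomuto partition          */
        if (a[j] < p) swap(&a[i++], &a[j]);
    swap(&a[i], &a[hi]);                           /* pivot into final position */
    rquicksort(a, lo, i - 1);
    rquicksort(a, i + 1, hi);
}

int main(void) {
    int a[] = {5, 2, 9, 1, 7, 3};
    srand((unsigned)time(NULL));
    rquicksort(a, 0, 5);
    for (int i = 0; i < 6; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}
```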
This book contains a selection of papers presented during a special workshop on Complexity Science organized as part of the 9th International Conference on GIScience 2016. Expert researchers in the areas of Agent-Based Modeling, Complexity Theory, Network Theory, Big Data, and emerging methods of Analysis and Visualization for new types of data explore novel complexity science approaches to dynamic geographic phenomena and their applications, addressing challenges and enriching research methodologies in geography in a Big Data Era.
Constant improvements in technological applications have allowed for more opportunities to develop automated systems. This not only leads to higher success in smart data analysis, but also ensures that technological progression will continue. Ubiquitous Machine Learning and its Applications is a pivotal reference source for the latest research on the issues and challenges machines face in the new millennium. Featuring extensive coverage on relevant areas such as computational advertising, software engineering, and bioinformatics, this publication is an ideal resource for academicians, graduate students, engineering professionals, and researchers interested in discovering how they can apply these advancements to various disciplines.
'New Technologies in Hospital Information Systems' was launched by the European Telematics Applications Project 'Healthcare Advanced Networked System Architecture' (HANSA) with the support of the GMDS WG Hospital Information Systems and the GMDS FA Medical Informatics. It contains 28 high-quality papers dealing with architectural concepts, models and developments for hospital information systems. The book is organized in seven sections: Reference Architectures, Modelling and Applications, The Distributed Healthcare Environment, Intranet Solutions, Object Orientation, Networked Solutions, and Standards and Applications. The HANSA project is based upon the European pre-standard for Healthcare Information System Architecture drawn up by CEN/TC 251 PT01-13. The editors felt that this standard would have a major impact on future developments in hospital information systems; it is therefore included in its entirety as an appendix.
This book discusses the formalization of mathematical theories centering on complex analysis and matrix theory, covering topics such as algebraic systems, complex numbers, gauge integration, the Fourier transformation and its discrete counterpart, matrices and their transformation, inner product spaces, and function matrices. The formalization is performed using the interactive theorem prover HOL4, chiefly developed at the University of Cambridge. Many of the developments presented are now integral parts of the library of this prover. As mathematical developments continue to gain in complexity, sometimes demanding proofs of enormous sizes, formalization has proven to be invaluable in terms of obtaining real confidence in their correctness. This book provides a basis for the computer-aided verification of engineering systems constructed using the principles of complex analysis and matrix theory, as well as building blocks for the formalization of more involved mathematical theories.
This book is about describing the meaning of programming languages. The author teaches the skill of writing semantic descriptions as an efficient way to understand the features of a language. While a compiler or an interpreter offers a form of formal description of a language, it is not something that can be used as a basis for reasoning about that language, nor can it serve as a definition of a programming language itself, since this must allow a range of implementations. By writing a formal semantics of a language, a designer can produce a far shorter description and tease out, analyse and record design choices. Early in the book the author introduces a simple notation, a meta-language, used to record descriptions of the semantics of languages. In a practical approach, he considers dozens of issues that arise in current programming languages and the key techniques that must be mastered in order to write the required formal semantic descriptions. The book concludes with a discussion of the eight key challenges: delimiting a language (concrete representation), delimiting the abstract content of a language, recording semantics (deterministic languages), operational semantics (non-determinism), context dependency, modelling sharing, modelling concurrency, and modelling exits. The content is class-tested and suitable for final-year undergraduate and postgraduate courses. It is also suitable for any designer who wants to understand languages at a deep level. Most chapters offer projects, some of which are quite advanced exercises that ask for complete descriptions of languages, and the book is supported throughout with pointers to further reading and resources. As a prerequisite the reader should know at least one imperative high-level language and have some knowledge of discrete mathematics notation for logic and set theory.
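By way of illustration, a typical rule of the kind such a meta-language records is the big-step (natural semantics) rule for a while-loop whose guard evaluates to true, written here in generic notation rather than in the book's own meta-language.

```latex
% Big-step (natural semantics) rule for "while b do S" when the guard holds,
% in generic notation (not the book's meta-language). \sigma denotes a state.
\[
  \frac{(b,\sigma) \Downarrow \mathbf{true} \qquad
        (S,\sigma) \Downarrow \sigma' \qquad
        (\texttt{while } b \texttt{ do } S,\ \sigma') \Downarrow \sigma''}
       {(\texttt{while } b \texttt{ do } S,\ \sigma) \Downarrow \sigma''}
\]
```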
The financial meltdown resulting from the subprime mortgage fiasco culminated in the most dramatic economic slowdown since the Great Depression. The global economic crisis raised the debate about the role of financial institutions and the role of regulators in an increasingly interconnected and rapidly changing world. It also altered the marketplace's perception of historically trusted financial institutions. Over the years, geopolitical, economic and technical trends have had a subtle, but very powerful, impact on the basic business model for financial institutions worldwide and on their interactions with accountholders. Add to that increased margin pressures, regulatory and compliance issues, fraud and compliance concerns, and competitive threats, and it becomes obvious that the old business model simply won't work going forward. At the same time, the financial industry is littered with some of the oldest technologies of any industry, which contributed to the poor credit decisions that fueled the crisis. A recognized entrepreneur and award-winning innovator, Louis Hernandez, Jr., using historical examples, points out that the rate of change impacting the financial services industry is accelerating. The industry has been slow to respond to change, and the focus on the recent crisis has uncovered fundamental problems that financial institutions have been avoiding. Hernandez outlines a process to map the future direction of individual institutions and the industry in a way that addresses near-term issues and overarching global changes, such as a re-emergent Asia and the dynamics of a knowledge economy. He points out that the "Too Big to Fail" thesis has given way to the seemingly more prudent, community-based institutions that largely avoided the subprime crisis. These institutions have demonstrated that they represent a unique pillar of economic stability. Now, he says, is the perfect time for the leaders of these community-based institutions to seize the day and lead the financial services industry back to the center of economic vitality and drive global economic growth, one community at a time. In Too Small to Fail, Hernandez issues the call to action, "Do you have the extraordinary drive it will take to inspire the industry and bring financial institutions back to their place as trusted intermediaries?"
This book offers a holistic framework to study behavior and evolutionary dynamics in large-scale, decentralized, and heterogeneous crowd networks. In the emerging crowd cyber-ecosystems, millions of deeply connected individuals, smart devices, government agencies, and enterprises actively interact with each other and influence each other's decisions. It is crucial to understand such intelligent entities' behaviors and to study their strategic interactions in order to provide important guidelines on the design of reliable networks capable of predicting and preventing detrimental events with negative impacts on our society and economy. This book reviews the fundamental methodologies to study user interactions and evolutionary dynamics in crowd networks and discusses recent advances in this emerging interdisciplinary research field. Using information diffusion over social networks as an example, it presents a thorough investigation of the impact of user behavior on the network evolution process and demonstrates how this can help improve network performance. Intended for graduate students and researchers from various disciplines, including, but not limited to, data science, networking, signal processing, complex systems, and economics, the book encourages researchers in related fields to explore the many untouched areas in this domain, and ultimately to design crowd networks with efficient, effective, and reliable services.
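One standard formalism for such evolutionary dynamics, used here purely as an illustration (the book develops its own models), is the replicator equation, in which the share of users adopting a strategy grows whenever that strategy's payoff exceeds the population average.

```latex
% Replicator dynamics (generic illustration; the book develops its own
% evolutionary models). x_i is the fraction of users adopting strategy i,
% f_i(x) its payoff, and \bar{f}(x) the population-average payoff.
\[
  \dot{x}_i = x_i \left( f_i(x) - \bar{f}(x) \right), \qquad
  \bar{f}(x) = \sum_j x_j f_j(x)
\]
```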
The acceleration of the Internet and the growing importance of ICT in globalized markets have played a vital role in the increasingly difficult standardization work of ICT companies. Given the economic importance of standards, companies and organizations are bringing their own ideas and technologies into the Internet's standards-setting processes. Innovations in Organizational IT Specification and Standards Development presents current research on all aspects of IT standards and standardization. This book aims to be a useful source of knowledge for IT researchers, scholars, and practitioners alike.
This book introduces new models based on R-calculus and theories of belief revision for dealing with large and changing data. It extends R-calculus from first-order logic to propositional logic, description logics, modal logic and logic programming, and from minimal change semantics to subset-minimal change, pseudo-subformula-minimal change and deduction-based minimal change (the last two minimal changes are newly defined), and it proves soundness and completeness theorems with respect to the minimal changes in these logics. To make R-calculus computable, an approximate R-calculus is given which uses the finite injury priority method from recursion theory. Moreover, two applications of R-calculus are given, to default theory and to semantic inheritance networks. This book offers a rich blend of theory and practice and is suitable for students, researchers and practitioners in the field of logic. It is also very useful for all those who are interested in data, digitization and the correctness and consistency of information, in modal logics, non-monotonic logics, decidable/undecidable logics, logic programming, description logics, default logics and semantic inheritance networks.
This book covers all of the concepts required to tackle second-order cone programs (SOCPs), in order to provide the reader with a complete picture of SOC functions and their applications. SOCPs have attracted considerable attention due to their wide range of applications in engineering, data science, and finance. To deal with this special group of optimization problems involving second-order cones (SOCs), we most often need to employ the following crucial concepts: (i) spectral decomposition associated with SOCs, (ii) analysis of SOC functions, and (iii) SOC-convexity and -monotonicity. Moreover, the related algorithms can be roughly classified into two categories. One category consists of traditional algorithms that do not use complementarity functions; here, SOC-convexity and SOC-monotonicity play a key role. The other category employs complementarity functions, which are closely related to SOC functions; consequently, the analysis of SOC functions can help with these algorithms.
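For concreteness, the spectral decomposition associated with the second-order cone takes the following standard form in the literature (the book's notation may differ): every vector splits into two spectral values and two spectral vectors, and SOC functions are defined through them.

```latex
% Spectral decomposition w.r.t. the second-order cone K^n (standard form in
% the literature; the book's notation may differ). Write x = (x_1, x_2) with
% x_1 \in \mathbb{R} and x_2 \in \mathbb{R}^{n-1}.
\[
  x = \lambda_1(x)\, u_x^{(1)} + \lambda_2(x)\, u_x^{(2)}, \qquad
  \lambda_i(x) = x_1 + (-1)^i \lVert x_2 \rVert, \qquad
  u_x^{(i)} = \tfrac{1}{2}\Bigl(1,\ (-1)^i \tfrac{x_2}{\lVert x_2 \rVert}\Bigr),
\]
% with x_2/\lVert x_2 \rVert replaced by any fixed unit vector when x_2 = 0.
% An SOC function is then
% f^{soc}(x) = f(\lambda_1(x))\,u_x^{(1)} + f(\lambda_2(x))\,u_x^{(2)}.
```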
The book focuses on system dependability modeling and calculation, considering the impact of s-dependency and uncertainty. The approaches best suited to practical system dependability modeling and calculation, (1) the minimal cut approach, (2) the Markov process approach, and (3) the Markov minimal cut approach as a combination of (1) and (2), are described in detail and applied to several examples. Boolean logic, used stringently throughout the development of the approaches, is the key to combining them on a common basis. For large and complex systems, efficient approximation approaches, e.g. the probable Markov path approach, have been developed, which can take into account s-dependencies between components of complex system structures. A comprehensive analysis of aleatory uncertainty (due to randomness) and epistemic uncertainty (due to lack of knowledge), and their combination, developed on the basis of basic reliability indices and evaluated with the Monte Carlo simulation method, is carried out. The uncertainty impact on system dependability is investigated and discussed using several examples with different levels of difficulty. The applications cover a wide variety of large and complex (real-world) systems. Current state-of-the-art definitions of terms from the IEC 60050-192:2015 standard, as well as the dependability indices, are used uniformly in all six chapters of the book.
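To indicate the flavor of the minimal cut approach, the usual first-order approximation of system unavailability from minimal cut sets is reproduced below in generic reliability-theory notation; the book's exact development and notation may differ.

```latex
% First-order minimal-cut approximation of system unavailability (generic
% reliability-theory form; the book's development may differ). C_k denotes
% the k-th minimal cut set and q_i the unavailability of component i,
% assumed s-independent and small.
\[
  Q_S \;\approx\; \sum_{k} \ \prod_{i \in C_k} q_i
\]
```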
This is a comprehensive study of various time-dependent scheduling problems in single-, parallel- and dedicated-machine environments. In addition to complexity issues and exact or heuristic algorithms which are typically presented in scheduling books, the author also includes more advanced topics such as matrix methods in time-dependent scheduling, time-dependent scheduling with two criteria and time-dependent two-agent scheduling. The reader should be familiar with the basic notions of calculus, discrete mathematics and combinatorial optimization theory, while the book offers introductory material on theory of algorithms, NP-complete problems, and the basics of scheduling theory. The author includes numerous examples, figures and tables, he presents different classes of algorithms using pseudocode, he completes all chapters with extensive bibliographies, and he closes the book with comprehensive symbol and subject indexes. The previous edition of the book focused on computational complexity of time-dependent scheduling problems. In this edition, the author concentrates on models of time-dependent job processing times and algorithms for solving time-dependent scheduling problems. The book is suitable for researchers working on scheduling, problem complexity, optimization, heuristics and local search algorithms.
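As a point of reference, a commonly studied model of time-dependent job processing times is linear deterioration, in which a job's processing time grows with its start time. The notation below is the standard form from the time-dependent scheduling literature; the book's own models and notation may differ.

```latex
% Linear deterioration model of time-dependent processing times (standard
% form in the literature; the book's notation may differ). A job j started
% at time t has processing time
\[
  p_j(t) = a_j + b_j\, t, \qquad a_j \ge 0,\ b_j > 0,
\]
% where a_j is the basic processing time and b_j the deterioration rate.
```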
Ada's Legacy illustrates the depth and diversity of writers, thinkers, and makers who have been inspired by Ada Lovelace, the English mathematician and writer. The volume, which commemorates the bicentennial of Ada's birth in December 1815, celebrates Lovelace's many achievements as well as the impact of her life and work, which has reverberated widely since the late nineteenth century. In the 21st century we have seen a resurgence in Lovelace scholarship, thanks to the growth of interdisciplinary thinking and the expanding influence of women in science, technology, engineering and mathematics. Ada's Legacy is a unique contribution to this scholarship, thanks to its combination of papers on Ada's collaboration with Charles Babbage, Ada's position in the Victorian and Steampunk literary genres, Ada's representation in and inspiration of contemporary art and comics, and Ada's continued relevance in discussions around gender and technology in the digital age. With the 200th anniversary of Ada Lovelace's birth falling on December 10, 2015, the timing is perfect to publish this collection of papers. Because of its broad focus on subjects that reach far beyond the life and work of Ada herself, Ada's Legacy will appeal to readers who are curious about Ada's enduring importance in computing and the wider world.
CAMD, or Computer Aided Molecular Design, refers to the design of molecules with desirable properties. That is, through CAMD, one determines molecules that match a specified set of (target) properties. As a technique, CAMD has very large potential since, in principle, all kinds of chemical, bio-chemical and material products can be designed through it.
You may like...
- Programming Finite Elements in Java (TM), Gennadiy P. Nikishkov (Hardcover, R2,709)
- Formal Methods for Open Object-Based…, Paolo Ciancarini, Alessandro Fantechi, … (Hardcover, R5,838)
- TAPSOFT '87: Proceedings of the…, Hartmut Ehrig, Robert Kowalski, … (Paperback, R1,500)