This book is dedicated to Prof. Dr. Heinz Gerhäuser on the occasion of his retirement both from the position of Executive Director of the Fraunhofer Institute for Integrated Circuits IIS and from the Endowed Chair of Information Technologies with a Focus on Communication Electronics (LIKE) at the Friedrich-Alexander-Universität Erlangen-Nürnberg. Heinz Gerhäuser's vision and entrepreneurial spirit have made the Fraunhofer IIS one of the most successful and renowned German research institutions. He has been Director of the Fraunhofer IIS since 1993, and under his leadership it has grown to become the largest of Germany's 60 Fraunhofer Institutes, a position it retains to this day, currently employing over 730 staff. Probably his most important scientific and application-related contribution was his pivotal role in the development of the mp3 format, which would later become a worldwide success. The contributions to this Festschrift were written by both Fraunhofer IIS staff and external project team members in appreciation of Prof. Dr. Gerhäuser's lifetime academic achievements and his inspiring leadership at the Fraunhofer IIS. The papers reflect the broad spectrum of the institute's research activities and are grouped into sections on circuits, information systems, visual computing, and audio and multimedia. They provide academic and industrial researchers in fields such as signal processing, sensor networks, microelectronics, and integrated circuits with an up-to-date overview of research results that have huge potential for cutting-edge industrial applications.
As technology becomes the largest source of modern economic growth, the emergence of new models is challenging the standard Western model of organizational management. Companies from all over the world have succeeded in creating emerging economies with these new models and are now competing with established multinational corporations. Organizational Innovation and IT Governance in Emerging Economies develops a methodological framework that supports new approaches to technological innovation by companies. This reference book provides contributions from experts in emerging economies, highlighting specific case studies of home-grown companies from these emerging markets and offering policymakers, government officers, academics, researchers, students, and practitioners lessons on how traditional multinationals can compete with these new companies.
In this book, the following three approaches to data analysis are presented: Test Theory, founded by Sergei V. Yablonskii (1924-1998), whose first publications appeared in 1955 and 1958; Rough Sets, founded by Zdzisław I. Pawlak (1926-2006), whose first publications appeared in 1981 and 1982; and Logical Analysis of Data, founded by Peter L. Hammer (1936-2006), whose first publications appeared in 1986 and 1988. These three approaches have much in common, but researchers active in one of these areas often have limited knowledge of the results and methods developed in the other two. On the other hand, each of the approaches shows some originality, and we believe that the exchange of knowledge can stimulate the further development of each of them. This can lead to new theoretical results and real-life applications; in particular, new results based on a combination of these three data analysis approaches can be expected.
This book is devoted to the state of the art in all aspects of the fireworks algorithm (FWA), with particular emphasis on efficient improved versions of FWA. It describes the most substantial theoretical analysis, including the basic principle and implementation of FWA as well as its modeling and theoretical analysis. It exhaustively covers the key recent research on improvements of FWA. In addition, the book describes several advanced topics in FWA research, including multi-objective optimization (MOO), discrete FWA (DFWA) for combinatorial optimization, and GPU-based FWA for parallel implementation. Subsequently, several successful applications of FWA to non-negative matrix factorization (NMF), text clustering, pattern recognition, the seismic inversion problem, and swarm robotics are illustrated in detail, which might shed new light on more real-world applications in the future. Addressing a multidisciplinary topic, it will appeal to researchers and professionals in the areas of metaheuristics, swarm intelligence, evolutionary computation, complex optimization solving, etc.
In the world we live in, concurrency is the norm. For example, the human body is a massively concurrent system, comprising a huge number of cells, all simultaneously evolving and independently engaging in their individual biological processing. In addition, in the biological world, truly sequential systems rarely arise. However, they are more common when man-made artefacts are considered. In particular, computer systems are often developed from a sequential perspective. Why is this? The simple reason is that it is easier for us to think about sequential, rather than concurrent, systems. Thus, we use sequentiality as a device to simplify the design process. However, the need for increasingly powerful, flexible and usable computer systems mitigates against simplifying sequentiality assumptions. A good example of this is the all-powerful position held by the Internet, which is highly concurrent at many different levels of decomposition. Thus, the modern computer scientist (and indeed the modern scientist in general) is forced to think about concurrent systems and the subtle and intricate behaviour that emerges from the interaction of simultaneously evolving components. Over a period of 25 years or so, the field of concurrency theory has been involved in the development of a set of mathematical techniques that can help system developers to think about and build concurrent systems. These theories are the subject matter of this book.
The electronics and information technology revolution continues, but it is a critical time in the development of technology. Once again, we stand on the brink of a new era where emerging research will yield exciting applications and products destined to transform and enrich our daily lives! The potential is staggering and the ultimate impact is unimaginable, considering the continuing marriage of technology with fields such as medicine, communications and entertainment, to name only a few. But who will actually be responsible for transforming these potential new products into reality? The answer, of course, is today's (and tomorrow's) design engineers! The design of integrated circuits today remains an essential discipline in support of technological progress, and the authors of this book have taken a giant step forward in the development of a practice-oriented treatise for design engineers who are interested in the practical, industry-driven world of integrated circuit design.
The Third Conference on Applied Mathematics and Scientific Computing took place June 23-27, 2003 on the island of Brijuni, Croatia. The main goal of the conference was to exchange ideas among applied mathematicians in the broadest sense, both from inside and outside academia, as well as experts from other areas who apply different mathematical techniques. During the meeting there were invited and contributed talks and software presentations. Invited presentations were given by active researchers from the fields of approximation theory, numerical methods for differential equations, and numerical linear algebra. These proceedings contain research and review papers by invited speakers and selected contributed papers from the fields of applied and numerical mathematics. A particular aim of the conference was to encourage young scientists to present the results of their research. Traditionally, the best presentation given by a PhD student is rewarded. This year's awardee was Luka Grubišić (University of Hagen, Hagen, Germany), and we congratulate him on this achievement. It would have been hard to organize the conference without the generous support of the Croatian Ministry of Science and Technology, and we gratefully acknowledge it. We are also indebted to the main organizer, the Department of Mathematics, University of Zagreb. The motivating beauty of the natural surroundings should also be mentioned. Finally, we are thankful to Drs. Josip Tambača and Ivica Nakić for giving this book its final shape.
It is the aim of INDICES to document recent explorations in the various fields of philosophical logic and formal linguistics and their applications in other disciplines. The main emphasis of this series is on self-contained monographs covering particular areas of recent research and surveys of methods, problems, and results in all fields of inquiry where recourse to logical analysis and logical methods has been fruitful. INDICES will contain monographs dealing with the central areas of philosophical logic (extensional and intensional systems, indexical logics, non-classical logics, philosophy of logic, etc.) as well as studies in which these systems are applied to specific issues in philosophy, in the formal semantics of natural languages, the foundations of linguistic theory, in computational linguistics, and in theoretical computer science. Constructive type theory was first presented in 1970, by the Swedish logician Per Martin-Löf. It has become one of the main approaches used in the foundations of mathematics and computer science. But it has remained relatively unknown among linguists and philosophers, although it provides a considerable extension of the concepts and techniques of logic. The book first gives an introduction to type theory from the point of view of linguistics and the philosophy of language. Type theory is then applied in the areas of quantification, anaphora, temporal reference, and the structure of text and discourse. By virtue of the type-theoretical concepts of proof object and context, various phenomena of dependence and progression in language can be discussed in precise terms, and several well-known problems can be solved. A categorial grammar is presented to formally generate a fragment of English, together with an example of a computer implementation.
Assertion-based design is a powerful new paradigm that is facilitating quality improvement in electronic design. Assertions are statements used to describe properties of the design (i.e., design intent) that can be included to actively check correctness throughout the design cycle and even the lifecycle of the product. With the appearance of two new languages, PSL and SVA, assertions have already started to improve verification quality and productivity. This is the first book that presents an under-the-hood view of generating assertion checkers, and as such it provides a unique and consistent perspective on employing assertions in major areas such as specification, verification, debugging, on-line monitoring and design quality improvement.
This book presents a comprehensive overview of the various aspects of the development of smart cities from a European perspective. It presents both theoretical concepts and empirical studies and cases of smart city programs and their capacity to create value for citizens. The contributions in this book are the result of increasing interest in this topic, supported by both national governments and international institutions. The book offers a broad panorama of the most important aspects of smart city evolution and implementation. It compares European best practices and analyzes how smart projects and programs in cities could help improve the quality of life in the urban space and promote cultural and economic development.
"Computational Analysis of Terrorist Groups: Lashkar-e-Taiba" provides an in-depth look at Web intelligence, and at how advanced mathematics and modern computing technology can influence the insights we have on terrorist groups. This book primarily focuses on one famous terrorist group known as Lashkar-e-Taiba (or LeT), and how it operates. After 10 years of counter-Al-Qaeda operations, LeT is considered by many in the counter-terrorism community to be an even greater threat to the US and world peace than Al Qaeda. "Computational Analysis of Terrorist Groups: Lashkar-e-Taiba" is the first book that demonstrates how to use modern computational analysis techniques, including methods for "big data" analysis. This book presents how to quantify both the environment in which LeT operates and the actions it took over a 20-year period, and how to represent them as a relational database table. This table is then mined using sophisticated data mining algorithms in order to gain detailed mathematical, computational and statistical insights into LeT and its operations. This book also provides a detailed history of Lashkar-e-Taiba based on extensive analysis conducted using open source information and public statements. Each chapter includes a case study, as well as a slide describing the key results, which are available on the authors' web sites. "Computational Analysis of Terrorist Groups: Lashkar-e-Taiba" is designed for a professional market composed of government or military workers, researchers and computer scientists working in the web intelligence field. Advanced-level students in computer science will also find this valuable as a reference book.
Evolutionary algorithms (EAs) are now a mature family of problem-solving heuristics that has found its way into many important real-life problems and into leading-edge scientific research. Spatially structured EAs have different properties than standard, mixing EAs. By virtue of the structured disposition of the population members, they bring about new dynamical features that can be harnessed to solve difficult problems faster and more efficiently. This book describes the state of the art in spatially structured EAs, using graph concepts as a unifying theme. The models, their analysis, and their empirical behavior are presented in detail. Moreover, there is new material on non-standard networked population structures such as small-world networks. The book should be of interest to advanced undergraduate and graduate students working in evolutionary computation, machine learning, and optimization. It should also be useful to researchers and professionals working in fields where the topological structure of populations and their evolution play a role.
'Behavior' is an increasingly important concept in the scientific, societal, economic, cultural, political, military, living and virtual worlds. Behavior computing, or behavior informatics, consists of methodologies, techniques and practical tools for examining and interpreting behaviors in these various worlds. Behavior computing contributes to the in-depth understanding, discovery, application and management of behavior intelligence. With contributions from leading researchers in this emerging field, Behavior Computing: Modeling, Analysis, Mining and Decision includes chapters on: representation and modeling of behaviors; behavior ontology; behavior analysis; behavior pattern mining; clustering complex behaviors; classification of complex behaviors; behavior impact analysis; social behavior analysis; organizational behavior analysis; and behavior computing applications. The book provides a dedicated source of reference for the theory and applications of behavior informatics and behavior computing. Researchers, research students and practitioners in behavior studies, including the computer science, behavioral science, and social science communities, will find this state-of-the-art volume invaluable.
This is the first text and monograph about DNA computing, a molecular approach that might revolutionize our thinking and ideas about computing. Although it is too soon to predict whether computer hardware will change from silicon to carbon and from microchips to DNA molecules, the theoretical premises have already been studied extensively. The book starts with an introduction to DNA-related matters, the basics of biochemistry, and language and computation theory, and progresses to the most advanced mathematical theory developed so far in the area. All three authors are pioneers in the theory of DNA computing. Apart from being well-known scientists, they are known for their lucid writing. Many of their previous books have become classics in their field, and this book too is sure to follow their example.
Introduction: The International Federation for Information Processing (IFIP) is a non-profit umbrella organization for national societies working in the field of information processing. It was founded in 1960 under the auspices of UNESCO. It is organized into several technical committees. This book represents the proceedings of the 2008 conference of Technical Committee 8 (TC8), which covers the field of information systems. TC8 aims to promote and encourage the advancement of research and practice of concepts, methods, techniques and issues related to information systems in organisations. TC8 has established eight working groups covering the following areas: design and evaluation of information systems; the interaction of information systems and the organization; decision support systems; e-business information systems: multi-disciplinary research and practice; information systems in public administration; smart cards, technology, applications and methods; and enterprise information systems. Further details of the technical committee and its working groups can be found on our website (ifiptc8.dsi.uminho.pt). This conference was part of IFIP's World Computer Congress in Milan, Italy, which took place 7-10 September 2008. The occasion celebrated the 32nd anniversary of IFIP TC8. The call for papers invited researchers, educators, and practitioners to submit papers and panel proposals that advance concepts, methods, techniques, tools, issues, education, and practice of information systems in organizations. Thirty-one submissions were received.
Global optimization has emerged as one of the most exciting new areas of mathematical programming. It has attracted wide attention from many fields in the past few years, due to the success of new algorithms for addressing previously intractable problems from diverse areas such as computational chemistry and biology, biomedicine, structural optimization, computer science, operations research, economics, and engineering design and control. This book contains refereed invited papers presented at the 4th International Conference on Frontiers in Global Optimization, held at Santorini, Greece, during June 8-12, 2003. Santorini is one of the few sites in Greece with a wild beauty created by the explosion of a volcano in the middle of the gulf of the island. The mystic landscape, with its numerous multi-extrema, was an inspiring location, particularly for researchers working on global optimization. The three previous conferences, "Recent Advances in Global Optimization," "State-of-the-Art in Global Optimization," and "Optimization in Computational Chemistry and Molecular Biology: Local and Global Approaches," took place at Princeton University in 1991, 1995, and 1999, respectively. The papers in this volume focus on deterministic methods for global optimization, stochastic methods for global optimization, distributed computing methods in global optimization, and applications of global optimization in several branches of applied science and engineering, computer science, computational chemistry, structural biology, and bioinformatics.
This book gives senior undergraduate and beginning graduate students and researchers in computer vision, applied mathematics, computer graphics, and robotics a self-contained introduction to the geometry of 3D vision; that is, the reconstruction of 3D models of objects from a collection of 2D images. Following a brief introduction, Part I provides background material for the rest of the book. The two fundamental transformations, namely rigid body motion and perspective projection, are introduced, and image formation and feature extraction are discussed. Part II covers the classic theory of two-view geometry based on the so-called epipolar constraint. Part III shows that a more proper tool for studying the geometry of multiple views is the so-called rank condition on the multiple view matrix. Part IV develops practical reconstruction algorithms step by step and discusses possible extensions of the theory. Exercises are provided at the end of each chapter. Software for the examples and algorithms is available on the author's website.
This is a book about a code and about coding. The code is a case study which has been used to teach courses in e-Science at the Australian National University since 2001. Students learn advanced programming skills and techniques in the Java(TM) language. Above all, they learn to apply useful object-oriented design patterns as they progressively refactor and enhance the software. We think our case study, EScope, is as close to real life as you can get! It is a smaller version of a networked, graphical, waveform browser which is used in the control rooms of fusion energy experiments around the world. It is quintessential "e-Science" in the sense of e-Science being "computer science and information technology in the service of science". It is not, specifically, "Grid-enabled", but we develop it in a way that will facilitate its deployment onto the Grid. The standard version of EScope interfaces with a specialised database for waveforms, and related data, known as MDSplus. On the accompanying CD, we have provided you with software which will enable you to install MDSplus, EScope and sample data files onto Windows or Linux computers. There is much additional software, including many versions of the case study as it gets built up and progressively refactored using design patterns. There will be a home website for this book which will contain up-to-date information about the software and other aspects of the case study.
A Modular Calculus for the Average Cost of Data Structuring introduces MOQA, a new domain-specific programming language which guarantees the average-case time analysis of its programs to be modular. Time in this context refers to a broad notion of cost, which can be used to estimate the actual running time, but also other quantitative information such as power consumption, while modularity means that the average time of a program can be easily computed from the times of its constituents--something that no programming language of this scope has been able to guarantee so far. MOQA principles can be incorporated in any standard programming language. MOQA supports tracking of data and their distributions throughout computations, based on the notion of random bag preservation. This allows a unified approach to average-case time analysis, and resolves fundamental bottleneck problems in the area. The main techniques are illustrated in an accompanying Flash tutorial, where the visual nature of this method can provide new teaching ideas for algorithms courses. This volume, with forewords by Greg Bollella and Dana Scott, presents novel programs based on the new advances in this area, including the first randomness-preserving version of Heapsort. Programs are provided, along with derivations of their average-case time, to illustrate the radically different approach to average-case timing. The automated static timing tool applies the Modular Calculus to extract the average-case running time of programs directly from their MOQA code. A Modular Calculus for the Average Cost of Data Structuring is designed for a professional audience composed of researchers and practitioners in industry, with an interest in algorithmic analysis and also static timing and power analysis--areas of growing importance. It is also suitable as an advanced-level text or reference book for students in computer science, electrical engineering and mathematics.
Michel Schellekens obtained his PhD from Carnegie Mellon University, following which he worked as a Marie Curie Fellow at Imperial College London. Currently he is an Associate Professor at the Department of Computer Science at University College Cork - National University of Ireland, Cork, where he leads the Centre for Efficiency-Oriented Languages (CEOL) as a Science Foundation Ireland Principal Investigator.
ClearRevise is all about making your revision easy. At the end of the course, doing practice papers is useful - but an exam tutor can make a big difference. This book helps provide support from both angles and will really help you to ace the exam. The first section is your exam tutor. It shows you example questions with model answers. Just like a tutor, it gives you exam tips and lets you know what the examiner is looking for. Secondly, you are then given similar questions from the same topic for you to have a go at, applying your knowledge and tips. With over 400 marks in this section and all the answers provided you'll easily revise the topics as you go. Lastly, there are two complete exam papers written in the same style as the live OCR papers to try. They're exactly the same length and marks as the real exam, providing a realistic experience and a great opportunity to show how much you've progressed.
This book records the very first Working Conference of the newly established IFIP Working Group on Human-Work Interaction Design, which was hosted by the University of Madeira in 2006. The theme of the conference was synthesizing work analysis and design sketching, with a particular focus on how to read design sketches within different approaches to the analysis and design of human-work interaction. Authors were encouraged to submit papers about design sketches - for interfaces, for organizations of work, etc. - that they themselves had worked on. During the conference, they presented the lessons they had learnt from the design and evaluation process, citing reasons why the designs worked or why they did not. Researchers, designers and analysts in this way confronted concrete design problems in complex work domains and used this unique opportunity to share their own design problems and solutions with the community. Successfully practicing and doing research within Human-Work Interaction Design requires a high level of personal skill; the conference addressed this by bringing together designers, work analysts, and those whose research spans both analysis and design. They were asked to collaborate in small groups on the analysis of, and solutions to, a common design problem.
High-speed, power-efficient analog integrated circuits can be used as standalone devices or to interface with modern digital signal processors and micro-controllers in various applications, including multimedia, communication, instrumentation, and control systems. New architectures and the low device geometry of complementary metal-oxide-semiconductor (CMOS) technologies have accelerated the movement toward system-on-a-chip design, which merges analog circuits with digital and radio-frequency components.