Calculi of temporal logic are widely used in modern computer science. The temporal organization of information flows in the different architectures of laptops, the Internet, or supercomputers would not be possible without appropriate temporal calculi. In the age of digitalization and high-tech applications, people are often not aware that temporal logic is deeply rooted in the philosophy of modalities. A deep understanding of these roots opens avenues to the modern calculi of temporal logic, which have emerged by extending modal logic with temporal operators. Computationally, temporal operators can be introduced in different formalisms of increasing complexity, such as Basic Modal Logic (BML), Linear-Time Temporal Logic (LTL), Computation Tree Logic (CTL), and Full Computation Tree Logic (CTL*). Proof-theoretically, these formalisms of temporal logic can be interpreted by Gentzen's sequent calculus, tableau-based calculi, automata-based calculi, game-based calculi, and dialogue-based calculi, with different advantages for different purposes, especially in computer science. The book culminates in an outlook on trendsetting applications of temporal logics in future technologies such as artificial intelligence and quantum technology. However, it will not be sufficient, as in traditional temporal logic, to start from the everyday understanding of time. Since the 20th century, physics has fundamentally changed the modern understanding of time, which now also determines technology. In temporal logic, we are only just beginning to grasp these differences in proof theory, a task that requires interdisciplinary cooperation among proof theory, computer science, physics, technology, and philosophy.
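The temporal operators named above can be illustrated with a small sketch (not taken from the book, which treats infinite computations): a hypothetical evaluator for LTL formulas over a finite trace, where each state is a set of atomic propositions.

```python
# Minimal illustrative sketch: evaluating the LTL operators X ("next"),
# G ("globally"), F ("finally"), and U ("until") over a finite trace.
# Formulas are nested tuples, e.g. ("U", ("atom", "p"), ("atom", "q")).

def holds(formula, trace, i=0):
    """Recursively check an LTL formula at position i of a finite trace."""
    op = formula[0]
    if op == "atom":
        return formula[1] in trace[i]
    if op == "not":
        return not holds(formula[1], trace, i)
    if op == "and":
        return holds(formula[1], trace, i) and holds(formula[2], trace, i)
    if op == "X":  # next: the subformula holds in the successor state
        return i + 1 < len(trace) and holds(formula[1], trace, i + 1)
    if op == "G":  # globally: the subformula holds in every remaining state
        return all(holds(formula[1], trace, j) for j in range(i, len(trace)))
    if op == "F":  # finally: the subformula holds in some remaining state
        return any(holds(formula[1], trace, j) for j in range(i, len(trace)))
    if op == "U":  # until: the right side eventually holds, the left until then
        return any(
            holds(formula[2], trace, j)
            and all(holds(formula[1], trace, k) for k in range(i, j))
            for j in range(i, len(trace))
        )
    raise ValueError(f"unknown operator: {op}")

trace = [{"p"}, {"p"}, {"q"}]
print(holds(("U", ("atom", "p"), ("atom", "q")), trace))  # True
print(holds(("G", ("atom", "p")), trace))                 # False
```

The sketch stops at LTL; CTL and CTL* quantify over branching computation trees rather than a single trace, which is what makes them strictly more expressive in the hierarchy the blurb describes.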
This book is for graduate students and researchers, introducing modern foundational research in mathematics, computer science, and philosophy from an interdisciplinary point of view. Its scope includes proof theory, constructive mathematics and type theory, univalent mathematics and point-free approaches to topology, extraction of certified programs from proofs, automated proofs in the automotive industry, as well as the philosophical and historical background of proof theory. By filling the gap between (under-)graduate level textbooks and advanced research papers, the book gives a scholarly account of recent developments and emerging branches of the aforementioned fields.
Time is fundamental to our experience, but remains mysterious. This book shows how philosophers and scientists have tried to grapple with this most extraordinary of ordinary phenomena. From the attempts of early astronomers to reconcile solar, lunar, and terrestrial reckonings, to the huge expansions and contractions of time consciousness brought on by scientists as diverse as Newton, Darwin, and Einstein, this book shows how time is as much a matter of human choice as it is a matter of scientific precision.
Here, expert authors delineate approaches that can support both decision makers and their concerned populations in overcoming unwarranted fears and in elaborating policies based on scientific evidence. Four exemplary focus areas were chosen for in-depth review, namely:
- The scientific basis of risk management
- Risk management in the area of environmental and ecological policy
- Risk management in radiation medicine
- Risk management in the context of digitalization and robotics
General as well as specific recommendations are summarized in a memorandum. Fundamental thoughts on the topic are presented in the introductory part of the book. The idea for and contents of the book were developed at a workshop on "Sustainable Risk Management: How to manage risks in a sensible and responsible manner?" held in Feldafing at Lake Starnberg (Germany) on April 14 to 16, 2016. The book offers important information and advice for scientists, entrepreneurs, administrators, and politicians.
The theory of nonlinear, complex systems has by now become a proven problem-solving approach in the natural sciences. It is now also recognized that many if not most of our social, ecological, economic, and political problems are essentially of a global, complex, and nonlinear nature. And it is further accepted that any holistic perspective on the human mind and brain can hardly be achieved by any other approach. In this wide-ranging, scholarly but very concise treatment, physicist, computer scientist, and philosopher Klaus Mainzer discusses, in essentially nontechnical language, the common framework behind these ideas and challenges. Emphasis is given to the evolution of new structures in natural and cultural systems, and we are led to see clearly how the new integrative approach can give insights not available from traditional reductionistic methods. The fifth edition enlarges and revises almost all sections and adds an entirely new chapter on the complexity of economic systems. From the reviews of the fourth edition: "This book is ambitious, incredibly erudite with 22 pages of references, and is indisputably clearly and beautifully written and illustrated. It is perfectly suited to a first course on the science of complexity. Even beginners and young graduate students will have something to learn from this book." (Andre Hautot, Physicalia, Vol. 57 (3), 2005) "All-in-all, this highly recommended book is a wonderful resource for intuitive basic ideas in need of rigorous formulation." (Albert A. Mullin, Zentralblatt MATH, Vol. 1046, 2004) "Readers of this book will enjoy Mainzer's exposition, which is based on a tight coupling between classical and historical concepts from Plato and Aristotle to modern mathematical and physical developments. Every chapter begins with a section designed to orient the reader to the perspective of philosophical developments through the ages pertinent to the topic at hand.
The author takes pains to point out essential differences between classical science and the science of complexity. Thinking in Complexity is an outstandingly readable book." (Anutosh Moitra, The Industrial Physicist, August/September, 2004)
This book is for graduate students and researchers, introducing modern foundational research in mathematics, computer science, and philosophy from an interdisciplinary point of view. Its scope includes Predicative Foundations, Constructive Mathematics and Type Theory, Computation in Higher Types, Extraction of Programs from Proofs, and Algorithmic Aspects in Financial Mathematics. By filling the gap between (under-)graduate level textbooks and advanced research papers, the book gives a scholarly account of recent developments and emerging branches of the aforementioned fields.
In the 21st century, digitalization is a global challenge for mankind. Even to the public, it is obvious that our world is increasingly dominated by powerful algorithms and big data. But how computable is our world? Some people believe that successful problem solving in science, technology, and economies depends only on fast algorithms and data mining. Chances and risks are often not understood, because the foundations of algorithms and information systems are not studied rigorously. Actually, they are deeply rooted in logic, mathematics, computer science, and philosophy. Therefore, this book studies the foundations of mathematics, computer science, and philosophy in order to guarantee the security and reliability of knowledge through constructive proofs, proof mining, and program extraction. We start with the basics of computability theory, proof theory, and information theory. In a second step, we introduce new concepts of information and computing systems in order to overcome the gap between the digital world of logical programming and the analog world of real computing in mathematics and science. The book also considers consequences for digital and analog physics, computational neuroscience, financial mathematics, and the Internet of Things (IoT).
The quantum world has long since arrived in everyday life, without many people being aware of it. It includes transistors, diodes, and lasers, which have become indispensable parts of everyday devices. After this first generation of quantum technologies, we are now living in the second generation, in which basic principles of quantum mechanics are deliberately implemented in quantum-mechanical devices. These include the first prototypes of quantum computers, classical supercomputers with quantum simulation, quantum cryptography and quantum communication, and quantum sensing and quantum metrology. What struck Einstein in 1935 as a "spooky" effect has long since become the basis of revolutionary quantum communication in fiber-optic networks and satellite technology, heralding a future quantum internet. Quantum computers as general-purpose machines are only the tip of the iceberg of a technology that is gradually spreading as the network of our civilization. It is all the more urgent to understand the foundations of the quantum world behind this technology. Grasping these foundations and their interconnections, from the mathematical and physical basics to the technical applications, is a central aim of the book. A further concern of the book is the convergence with artificial intelligence. My book "Kunstliche Intelligenz. Wann ubernehmen die Maschinen?" (Springer, 2nd ed. 2019) highlights machine learning, which realizes automation in robotics, industry, and the working world. With quantum technology, quantum computers, and artificial intelligence, however, not only new possibilities but also new dangers are multiplying. Hence the demand for early technology design, so that quantum technology and artificial intelligence prove themselves as a service to society.
This new edition also treats smart materials and artificial life. A new chapter on information and computational dynamics takes up many recent discussions in the community.
This is a book about numbers - all kinds of numbers, from integers to p-adics, from rationals to octonions, from reals to infinitesimals. Who first used the standard notation for π? Why was Hamilton obsessed with quaternions? What was the prospect for "quaternionic analysis" in the 19th century? This is the story of one of the major threads of mathematics over thousands of years. It is a story that will give the reader both a glimpse of the mystery surrounding imaginary numbers in the 17th century and also a view of some major developments in the 20th.
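The quaternions mentioned above can be shown in a few lines. The sketch below is illustrative and not from the book: it implements Hamilton's product rule, derived from i² = j² = k² = ijk = -1, and exhibits the non-commutativity that makes quaternions historically remarkable.

```python
# Illustrative sketch: Hamilton's quaternion product.
# A quaternion (w, x, y, z) stands for w + x*i + y*j + z*k,
# with the defining relations i^2 = j^2 = k^2 = i*j*k = -1.

def qmul(a, b):
    """Multiply two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (
        w1*w2 - x1*x2 - y1*y2 - z1*z2,  # real part
        w1*x2 + x1*w2 + y1*z2 - z1*y2,  # i component
        w1*y2 - x1*z2 + y1*w2 + z1*x2,  # j component
        w1*z2 + x1*y2 - y1*x2 + z1*w2,  # k component
    )

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
print(qmul(i, j))  # (0, 0, 0, 1)  -> i*j = k
print(qmul(j, i))  # (0, 0, 0, -1) -> j*i = -k: multiplication is not commutative
```

Giving up commutativity was the price Hamilton paid for extending the complex numbers to four dimensions; the octonions in the same blurb give up associativity as well.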
Everybody knows them: smartphones that talk to us, wristwatches that record our health data, workflows that organize themselves automatically, cars, airplanes, and drones that control themselves, traffic and energy systems with autonomous logistics, or robots that explore distant planets - all technical examples of a networked world of intelligent systems. Machine learning is dramatically changing our civilization. We rely more and more on efficient algorithms, because otherwise we would not be able to cope with the complexity of our civilization's infrastructure. But how secure are AI algorithms? This challenge is taken up in the 2nd edition: complex neural networks are fed and trained with huge amounts of data (big data). The number of necessary parameters explodes exponentially. Nobody knows exactly what is going on inside these black boxes. In machine learning we need more explainability and accountability of causes and effects in order to be able to decide ethical and legal questions of responsibility (e.g. in autonomous driving or medicine). Besides causal learning, we also analyze procedures of testing and verification to obtain certified AI programs. Since its inception, AI research has been associated with great visions of the future of mankind. It is already a key technology that will decide the global competition between social systems. Artificial intelligence and responsibility is another central topic added in the 2nd edition: how should we secure our individual liberty rights in the AI world? This book is a plea for technology design: AI must prove itself as a service in society.
This Brief is an essay at the interface of philosophy and complexity research, trying to inspire the reader with new ideas and new conceptual developments of cellular automata. Going beyond the numerical experiments of Stephen Wolfram, it argues that cellular automata must be considered complex dynamical systems in their own right, requiring appropriate analytical models in order to find precise answers and predictions in the universe of cellular automata. Indeed, eventually we have to ask whether cellular automata can be considered models of the real world and, conversely, whether there are limits to our modern approach of attributing to the world, and to the universe for that matter, an essentially digital reality.
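Wolfram's "numerical experiments" that the essay starts from are easy to reproduce. The following is a minimal sketch, not taken from the Brief: an elementary cellular automaton in which every cell updates from its three-cell neighborhood according to an 8-bit rule number.

```python
# Minimal sketch of an elementary cellular automaton (Wolfram-style).
# The rule number's bit at index (left*4 + center*2 + right) gives the
# next state of each cell; the row wraps around at the edges.

def step(cells, rule):
    """Apply one synchronous update of an elementary CA with wraparound."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Rule 110, famously capable of universal computation, grown from a single cell.
cells = [0] * 15 + [1] + [0] * 15
for _ in range(5):
    print("".join(".#"[c] for c in cells))
    cells = step(cells, 110)
```

The essay's point is visible even in this toy: the update rule is trivially simple, yet predicting the long-run pattern generally requires running the automaton, which is why analytical models of such systems are a genuine research problem.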
Artificial intelligence is a key technology that raises great expectations in both science and industry. This book discusses both the prospects and the limits of this technology, covering the practical, theoretical, and conceptual challenges that AI must face. In an early phase of AI, expert systems stood in the foreground, processing rule-based knowledge by means of symbolic data processing. Today, AI is dominated by statistics-based methods in the field of machine learning. This subsymbolic AI is measured against the lessons that can be drawn from AI's early phase. As a result, the book argues above all for a hybrid AI that can bring the potential of both approaches to fruition.
The book gives a fascinating insight into the foundations of virtual network worlds. Computer-based information and communication networks are the driving forces of the development toward a knowledge society. They generate virtual network worlds in which we store our knowledge, plan innovations, conduct business, and seek art and entertainment. How does this change research and teaching in engineering, the natural sciences, economics, the social sciences, and the humanities? What do virtual life and biocomputing mean? Knowledge management in complex networks requires the help of autonomous, mobile, and intelligent software agents. For all the fascination and visions of the future emanating from the globalization of virtual network worlds, the goal should be humane service in the knowledge society.
Since its beginnings in the 1950s, research on computers and the brain has experienced an unprecedented boom. In the last decade of our century, the brain seems to be becoming the paradigm of novel learning and self-organizing systems that bear only a limited resemblance to the programmable computing machines of the past. Can the functioning of the human brain be compared with that of a computer at all? What do we know about life, and can this knowledge be used to construct new, powerful "intelligent" systems? Klaus Mainzer gives a fascinating insight into the complexity of thinking and discusses the possibilities and limits of artificial intelligence and artificial life.
Complex dynamical systems are studied in a variety of scientific disciplines - from physics, chemistry, biology, and medicine through cognitive science and psychology to sociology and economics. This volume takes stock and points out future research perspectives. The common methodological theme is the modeling of complex systems whose dynamics are governed by nonlinearity. Mathematical methods and computer simulations, however, only make sense when they are combined with concrete analyses in the individual disciplines. The book not only provides interdisciplinary information on the current state of research but can also be read by non-specialists as an introduction to the exciting field of complex systems and nonlinear dynamics in nature and society.
The difficulty of learning and teaching mathematics is familiar to anyone who has ever come into contact with the subject. Terms such as "real or complex numbers" and "pi" are familiar to everyone, but few know what really lies behind them. The authors of this volume give anyone who wants to know more than just the shell of these terms a masterly introduction to the magic of mathematics and build unique bridges for students. Reviewers of the first two editions were full of praise.
Digitalization has transformed the discourse of architecture: that discourse is now defined by a wealth of new terms and concepts that previously either had no meaning, or had different meanings, in the context of architectural theory and design. Its concepts and strategies are increasingly shaped by influences emerging at the intersection with scientific and cultural notions from modern information technology. The new series Context Architecture seeks to take a critical selection of concepts that play a vital role in the current discourse and put them up for discussion. When Vitruvius described the architect as a "uomo universale," he gave rise to the architect's conception of him- or herself as a generalist who shapes a complex reality. The architectural concept of complexity, however, failed to keep pace with industrial and social reality, becoming instead an increasingly formal and superficial notion that could ultimately be applied to almost anything. Against it, architectural modernism set the watchword of simplification: "less is more." In this situation, Robert Venturi reintroduced the notion of complexity into architectural discourse: his goal was not just to restore the complexity of architectonic forms and their history but also to explore the concrete reality of the existing built environment. Today it is complexity studies, with their starting point in physics, that define the current approach to the concept of complexity. They have established a new connection between the natural sciences and information technology and have thus become a central premise of computer-based approaches to design.