The nationwide research project `Deduktion', funded by the `Deutsche
Forschungsgemeinschaft (DFG)' for a period of six years, brought
together almost all research groups within Germany engaged in the
field of automated reasoning. Intensive cooperation and exchange of
ideas led to considerable progress both in the theoretical
foundations and in the application of deductive knowledge. This
three-volume book covers these original contributions moulded into
the state of the art of automated deduction. The three volumes are
intended to document and advance a development in the field of
automated deduction that can now be observed all over the world.
Rather than restricting the interest to purely academic research,
the focus now is on the investigation of problems derived from
realistic applications. In fact, industrial applications are already
being pursued on a trial basis. In consequence, the emphasis of the
volumes is not on the presentation of the theoretical foundations
of logical deduction as such, as in a handbook; rather the books
present the concepts and methods now available in automated
deduction in a form which can be easily accessed by scientists
working in applications outside of the field of deduction. This
reflects the strong conviction that automated deduction is on the
verge of being fully included in the evolution of technology.
Volume I focuses on basic research in deduction and on the
knowledge on which modern deductive systems are based. Volume II
presents techniques of implementation and details about system
building. Volume III deals with applications of deductive
techniques mainly, but not exclusively, to mathematics and the
verification of software. Each chapter was read by two referees, one
an international expert from abroad and the other a knowledgeable
participant in the national project, and was accepted for inclusion
on the basis of these review reports. Audience:
Researchers and developers in software engineering, formal methods,
certification, verification, validation, specification of complex
systems and software, expert systems, natural language processing.
This book is for both developers and decision makers on R/3
implementation teams who need a thorough, practical understanding of
the benefits, financial risks and technical background of IDocs and
ALE in interface development. It describes the implementation of
interfaces in an R/3 roll-out, important technologies such as RFC,
OLE and Workflow, and common standards like EDIFACT, ANSI X.12 or
XML. A large number of recipes deliver templates as a starting point
for the reader's own enhancements. It is for everybody who depends on
fast and cost-effective solutions for EDI, and it also discusses why
many EDI projects are ten times as expensive as they need to be.
Preparing the reader with the essential knowledge to survive the
outrageously fast-growing world of data communication and e-commerce
via internet and intranet, the book shows in a distilled manner how
enterprises using R/3 can efficiently implement Electronic Data
Interchange (EDI), both with external partners and with in-house
satellite systems. The book stands in the tradition of IT cookbooks,
in which the reader finds quick recipes and reliable information
covering all aspects of SAP interfacing, and it quickly became a
standard work for the R/3 world.
1. BASIC CONCEPTS OF INTERACTIVE THEOREM PROVING
Interactive theorem proving ultimately aims at the construction of
powerful reasoning tools that let us (computer scientists) prove
things we cannot prove without the tools, and the tools cannot prove
without us. Interaction is typically needed, for example, to direct
and control the reasoning, to speculate or generalize strategic
lemmas, and sometimes simply because the conjecture to be proved does
not hold. In software verification, for example, correct versions of
specifications and programs are typically obtained only after a
number of failed proof attempts and subsequent error corrections.
Different interactive theorem provers may actually look quite
different: they may support different logics (first- or higher-order,
logics of programs, type theory, etc.), may be generic or
special-purpose tools, or may be targeted at different applications.
Nevertheless, they share common concepts and paradigms (e.g.
architectural design, tactics, tactical reasoning). The aim of this
chapter is to describe the common concepts, design principles, and
basic requirements of interactive theorem provers, and to explore the
bandwidth of variations. Having a 'person in the loop' strongly
influences the design of the proof tool: proofs must remain
comprehensible, proof rules must be high-level and human-oriented,
and persistent proof presentation and visualization become very
important.
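A minimal sketch of this kind of tactic-level interaction, written in
Lean 4 purely as a present-day illustration (it is not one of the
systems discussed in the chapter, and the example is my own, not
taken from the text): the user issues tactics one at a time, and the
prover checks each step and reports the goals that remain open.

    -- Hypothetical illustration of tactic-style interaction.
    example (p q : Prop) (hp : p) (hpq : p → q) : p ∧ q := by
      constructor        -- split the conjunction: two goals remain, ⊢ p and ⊢ q
      · exact hp         -- close the first goal with the hypothesis hp
      · exact hpq hp     -- close the second goal by applying the implication to hp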
We are invited to deal with mathematical activity in a systematic way
[...] one does expect and look for pleasant surprises in this
requirement of a novel combination of psychology, logic, mathematics
and technology. Hao Wang, 1970, quoted from (Wang, 1970). The field
of mathematics has been a key application area for automated theorem
proving from the start; in fact, the very first automatically found
theorem was that the sum of two even numbers is even (Davis, 1983).
The field of automated deduction has witnessed considerable progress,
and in the last decade automated deduction methods have made their
way into many areas of research and product development in computer
science. For instance, deduction systems are increasingly used in
software and hardware verification to ensure the correctness of
computer hardware and computer programs with respect to a given
specification. Logic programming, while still falling somewhat short
of its expectations, is now widely used, deductive databases are well
developed, and logic-based description and analysis of hardware and
software is commonplace today.
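For concreteness, that very first theorem can be stated and proved in
a few lines in a modern interactive prover. The sketch below uses
Lean 4; the definition of IsEven and the proof are my own
illustration, not material from the book.

    -- Hypothetical formalization: a number is even if it is twice some k.
    def IsEven (n : Nat) : Prop := ∃ k, n = 2 * k

    -- The sum of two even numbers is even: the witness is the sum of the halves.
    theorem even_add_even {m n : Nat} (hm : IsEven m) (hn : IsEven n) :
        IsEven (m + n) :=
      match hm, hn with
      | ⟨a, ha⟩, ⟨b, hb⟩ =>
        ⟨a + b, by rw [ha, hb, Nat.left_distrib]⟩  -- 2*a + 2*b = 2*(a + b)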
Since both the contents and the structure of the book appeared to be
successful, only minor changes were made. In particular, some recent
work in ATP has been incorporated so that the book continues to
reflect the state of the art in the field. The most significant
change is in the quality of the layout, including the removal of a
number of inaccuracies and typing errors. R. Caferra, E. Eder, F. van
der Linden, and J. Muller have caught various minor errors. P.
Haddawy and S.T. Pope have provided many stylistic improvements of
the English text. Last but not least, A. Bentrup and W. Fischer have
produced the beautiful layout. The extensive work of typesetting was
financially supported within ESPRIT project 415. Munchen, September
1986, W. Bibel.
PREFACE
Among the dreams of mankind is the one dealing with the mechanization
of human thought. As the world today has become so complex that
humans apparently fail to manage it properly with their intellectual
gifts, the realization of this dream might even be regarded as
something like a necessity. On the other hand, the incredible
advances in computer technology make it appear a real possibility.
This volume contains the elaborated and harmonized versions of
seven lectures given at the first Advanced Course in Artificial
Intelligence, held in Vignieu, France, in July 1985. Most of them
were written in tutorial form; the book thus provides an extremely
valuable guide to the fundamental aspects of AI. In the first part,
Delgrande and Mylopoulos discuss the concept of knowledge and its
representation. The second part is devoted to the processing of
knowledge. The contribution by Huet shows that both computation and
inference or deduction are just different aspects of the same
phenomenon. The chapter written by Stickel gives a thorough and
knowledgeable introduction to the most important aspects of
deduction by some form of resolution. The kind of reasoning that is
involved in inductive inference, problem solving (or programming)
from examples, and learning is covered by Biermann. The
tutorial by Bibel covers the more important forms of knowledge
processing that might play a significant role in common sense
reasoning. The third part of the book focuses on logic programming
and functional programming. Jorrand presents the language FP2,
where term rewriting forms the basis for the semantics of both
functional and parallel programming. In the last chapter, Shapiro
gives an overview of the current state of concurrent PROLOG.
This book contains a selection of revised papers and
state-of-the-art overviews on current trends and future
perspectives of fuzzy systems. A major aim is to address
theoretical as well as application-oriented issues and to
contribute to the foundation of concepts, methods, and tools in
this field. The book is written by researchers who attended the
workshop "Fuzzy Systems '93 - Management of Uncertain Information"
(Braunschweig, Germany, October 21-22, 1993), organized by the
German Society of Computer Science (GI), the German Computer
Science Academy (DIA), and the University of Braunschweig. For this
volume the papers were revised and supplemented by overview articles
written by H. J. Zimmermann, H. Hellendorn, D. Nauck, C. Freksa, S.
Gottwald, and K. D. Meyer-Gramann.
The Computational Brain, the extraordinary book on comparative
research into the human brain and the latest possibilities of
computer technology, is now available in German for the first time.
Written by a leading research team in the USA, it is a treasure trove
for anyone who wants to know the current state of science in this
field. The authors skilfully bring together the fields of
neuroinformatics and neurobiology with well-chosen examples and the
necessary background information. The book thus does justice in an
outstanding way not only to specialists but also to the
interdisciplinary interests of computer scientists and biologists.
The book was translated by Prof. Dr. Steffen Holldobler and
Dipl.-Biol. Claudia Holldobler, a computer scientist and a biologist.
Review in Spektrum der Wissenschaft, no. 10, p. 122 f., October 1997:
(...) The American original edition of this work, published in 1992,
has been so successful that it can already be called a classic. (...)
(...) the book is highly recommended. Unrivalled in its combination
of neurobiology and neuroinformatics, it conveys some of the
fascination of theoretical brain research, which is drawing more and
more scientists under its spell in Germany as well.
Review published in: Computer Spektrum 3/1997, p. 2:
(...) The book thus does justice in an outstanding way not only to
specialists but also to the interdisciplinary interests of computer
scientists and biologists. (...)
This handbook covers all facets of web mining. First, the web mining
process is described in detail, with particular attention to the
preprocessing of internet-specific data. Special emphasis is placed
on the numerous application potentials of web mining, with
fundamental considerations being complemented by the results of
projects that have already been carried out.
The essays deal with private experiential knowledge and scientific
insights on the topics of body, mind, soul, interpersonal
relationships, social, economic and political structures, as well as
science, religion and art.
Using practical cases, the book shows how data mining and business
intelligence techniques can be used to discover patterns of behaviour
and knowledge in large datasets. Examples include customer
segmentation, credit scoring and advertising media planning in
industries such as mail order, insurance, retail and
telecommunications.
The book covers the most important methods for recognizing and
extracting "knowledge" from numerical and non-numerical databases in
engineering and business. These include algorithms for preprocessing,
preparing, visualizing and analysing data. In addition to linear
statistical methods, modern techniques from the fields of cluster
analysis, fuzzy logic, neuroinformatics, machine learning, decision
trees and agent systems are presented.
Seeing is the extraction of information from images. Which sources of
information are used for this, and how the analysis can be carried
out in detail, is the subject of this introductory textbook. It
covers the so-called competence theory of vision for the elementary
percepts such as contrast, colour, depth and motion. Eye movements
and navigation are treated as visually guided behaviours. Computer
vision and the perceptual mechanisms of humans are presented together
and comparatively wherever possible. The mathematical methods used
are introduced and explained in the text; a glossary of the essential
terms aids understanding.
All processes in nature contain one or more uncertain components,
exhibit uncertainties, or have a more or less uncertain outcome. One
can distinguish whether a process, or part of it, is regarded as
uncertain because it cannot be captured exactly and deterministically
(e.g. the price movements on a stock exchange), whether it is
regarded as genuinely random (e.g. the radioactive decay of a
substance), or whether the uncertainty stems from its description in
vague terms. Our highly complex social and technical structures of
today are no longer conceivable without methods for handling
uncertain effects; one need only think of life and health insurance
on the one hand and the calculation of the reliability of technical
systems and processes on the other. The development of mathematical
tools for probability theory and statistics led to the position of
stochastics, unchallenged well into the twentieth century, as the
best scientific method for dealing with aspects of uncertainty. In
the second half of the twentieth century, fuzzy theory, which Lotfi
Zadeh founded in his paper Fuzzy Sets (1965) as a generalization of
Cantor's set theory, then established itself as a serious competitor
for the task of modelling uncertainty. Subsequent developments
brought a decades-long dispute between stochasticians and proponents
of fuzzy theory, but also an extremely successful application of the
theory in many areas of applied science and industry.
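As a one-line illustration of that generalization (my notation, not
drawn from the book): a classical, crisp set is described by a
characteristic function that takes only the values 0 and 1, whereas a
fuzzy set in Zadeh's sense is given by a membership function that
admits intermediate degrees,

    \chi_A : X \to \{0, 1\}, \qquad
    \chi_A(x) = \begin{cases} 1, & x \in A \\ 0, & x \notin A \end{cases}
    \qquad \text{generalized to} \qquad
    \mu_A : X \to [0, 1].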
Artificial neural networks, fuzzy set theory and evolutionary
algorithms, as innovative and complementary problem-solving
approaches, are today grouped under the heading of soft computing,
also called computational intelligence. The book offers a compact,
easily understandable introduction to the subject and documents
current business applications and research projects in soft
computing. This opens up innovation potential that can lead to cost
advantages and efficiency gains. Numerous figures complement the
text. The book can serve as a basis for developing one's own
applications or as an accompanying text for courses.
Each year the Gesellschaft fur Informatik (GI) honours one computer
science dissertation with a prize. The selection is based on the
dissertations nominated for this prize by the universities, so the
participants in the GI selection procedure have already been
distinguished as "prize winners" of their own institutions. The GI
committee that nominates the prize winner from among the proposed
candidates held a colloquium in the rooms of the Akademie der
Wissenschaften und Literatur in Mainz, giving the candidates the
opportunity to present and defend their results before their fellow
competitors. The committee was very positively impressed by the high
standard of the submitted theses and of the presentations. The
participants greatly welcomed the colloquium, took part in the
discussion and valued the opportunity to talk with participants from
other universities. The generous hospitality of the academy, which is
gratefully acknowledged here, also contributed to the success of the
colloquium. It was difficult for the committee to choose the prize
winner from among the candidates shortlisted after the colloquium.
The publication of the extended abstracts presented here somewhat
compensates for the injustice of selecting one candidate from among
several of equal merit.
Neural networks have been the subject of intensive research in recent
years. This book combines a presentation of the latest results on
learning methods with application-oriented aspects. Methodological
principles for building software systems based on connectionist
techniques are worked out, and case studies from various application
domains show the wide range of uses for neural networks.
This book is the standard work on a new area of applied fuzzy
technology, fuzzy cluster analysis. It covers pattern-recognition
methods for grouping and structuring data. In contrast to classical
clustering techniques, data points are not assigned unambiguously to
classes; instead, degrees of membership are determined, so that the
fuzzy methods are robust against corrupted or noisy data and can
handle gradual transitions between classes. The work gives a
methodical introduction to the numerous fuzzy clustering algorithms
and their applications in data analysis, the generation of rules for
fuzzy controllers, and classification and approximation problems, as
well as a detailed treatment of shell clustering for detecting
geometric contours in images.
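To make the difference from hard clustering concrete, here is a small
fuzzy c-means sketch in Python (my own illustration, not code from
the book; the function name and the toy data are assumptions): every
point receives a degree of membership in each cluster instead of a
single class label.

    # Minimal fuzzy c-means sketch (illustration only).
    import numpy as np

    def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
        """Return cluster centers and a soft membership matrix U (points x clusters)."""
        rng = np.random.default_rng(seed)
        U = rng.random((X.shape[0], c))
        U /= U.sum(axis=1, keepdims=True)              # memberships of each point sum to 1
        for _ in range(n_iter):
            W = U ** m                                 # "fuzzified" memberships
            centers = (W.T @ X) / W.sum(axis=0)[:, None]   # membership-weighted means
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2) + 1e-12
            inv = d2 ** (-1.0 / (m - 1))               # closer centers get more weight
            U = inv / inv.sum(axis=1, keepdims=True)   # renormalize per point
        return centers, U

    rng = np.random.default_rng(42)
    X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
    centers, U = fuzzy_c_means(X, c=2)
    print(U[:3].round(2))                              # soft memberships rather than hard labels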
This AI textbook is aimed at business informatics specialists and
computer scientists, as well as engineers and OR specialists. It
offers a comprehensive, methodically oriented introduction to
optimization with evolutionary algorithms, above all genetic
algorithms, evolution strategies, and genetic or evolutionary
programming. Important results from the theory are presented in an
easily understandable form. Numerous figures and examples, as well as
pointers to sources on the internet and to test data, complement the
text. The book can serve as a basis for developing one's own
applications or as an accompanying text for courses.
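As a small, self-contained illustration of the simplest of these
methods, a genetic algorithm, here is a Python sketch (my own code,
not taken from the book; the OneMax objective and all parameter
values are assumptions): a population of bit strings evolves through
tournament selection, one-point crossover and bit-flip mutation.

    # Minimal genetic-algorithm sketch (illustration only).
    import random

    def genetic_algorithm(bits=30, pop_size=40, generations=60,
                          crossover_rate=0.9, mutation_rate=0.02, seed=0):
        rng = random.Random(seed)

        def fitness(ind):
            return sum(ind)                          # OneMax: count the 1-bits

        pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
        for _ in range(generations):
            def select():                            # tournament of size 2
                a, b = rng.choice(pop), rng.choice(pop)
                return a if fitness(a) >= fitness(b) else b

            children = []
            while len(children) < pop_size:
                p1, p2 = select(), select()
                if rng.random() < crossover_rate:    # one-point crossover
                    cut = rng.randrange(1, bits)
                    c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
                else:
                    c1, c2 = p1[:], p2[:]
                for child in (c1, c2):               # bit-flip mutation
                    children.append([1 - g if rng.random() < mutation_rate else g
                                     for g in child])
            pop = children[:pop_size]
        return max(pop, key=fitness)

    best = genetic_algorithm()
    print(sum(best), "of 30 bits set")               # usually at or near the optimum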