This book constitutes the refereed proceedings of the International Symposium on Logical Foundations of Computer Science, LFCS 2013, held in San Diego, CA, USA in January 2013. The volume presents 29 revised refereed papers carefully selected by the program committee. The scope of the Symposium is broad and includes constructive mathematics and type theory; logic, automata and automatic structures; computability and randomness; logical foundations of programming; logical aspects of computational complexity; logic programming and constraints; automated deduction and interactive theorem proving; logical methods in protocol and program verification; logical methods in program specification and extraction; domain theory logic; logical foundations of database theory; equational logic and term rewriting; lambda and combinatory calculi; categorical logic and topological semantics; linear logic; epistemic and temporal logics; intelligent and multiple agent system logics; logics of proof and justification; nonmonotonic reasoning; logic in game theory and social software; logic of hybrid systems; distributed system logics; mathematical fuzzy logic; system design logics; and other logics in computer science.
Approximate reasoning is a key motivation in fuzzy sets and possibility theory. This volume provides a coherent view of this field, and its impact on database research and information retrieval. First, the semantic foundations of approximate reasoning are presented. Special emphasis is given to the representation of fuzzy rules and specialized types of approximate reasoning. Then syntactic aspects of approximate reasoning are surveyed and the algebraic underpinnings of fuzzy consequence relations are presented and explained. The second part of the book is devoted to inductive and neuro-fuzzy methods for learning fuzzy rules. It also contains new material on the application of possibility theory to data fusion. The last part of the book surveys the growing literature on fuzzy information systems. Each chapter contains extensive bibliographical material. Fuzzy Sets in Approximate Reasoning and Information Systems is a major source of information for research scholars and graduate students in computer science and artificial intelligence, interested in human information processing.
Before his death in March 1976, A. H. Lightstone delivered the manuscript for this book to Plenum Press. Because he died before the editorial work on the manuscript was completed, I agreed (in the fall of 1976) to serve as a surrogate author and to see the project through to completion. I have changed the manuscript as little as possible, altering certain passages to correct oversights. But the alterations are minor; this is Lightstone's book. (H. B. Enderton) From the Preface: This is a treatment of the predicate calculus in a form that serves as a foundation for nonstandard analysis. Classically, the predicates and variables of the predicate calculus are kept distinct, inasmuch as no variable is also a predicate; moreover, each predicate is assigned an order, a unique natural number that indicates the length of each tuple to which the predicate can be prefixed. These restrictions are dropped here, in order to develop a flexible, expressive language capable of exploiting the potential of nonstandard analysis. To assist the reader in grasping the basic ideas of logic, we begin in Part I by presenting the propositional calculus and statement systems. This provides a relatively simple setting in which to grapple with the sometimes foreign ideas of mathematical logic. These ideas are repeated in Part II, where the predicate calculus and semantical systems are studied.
One high-level ability of the human brain is to understand what it has learned. This seems to be the crucial advantage in comparison to the brain activity of other primates. At present we are technologically almost ready to artificially reproduce human brain tissue, but we still do not fully understand the information processing and the related biological mechanisms underlying this ability. Thus an electronic clone of the human brain is still far from being realizable. At the same time, around twenty years after the revival of the connectionist paradigm, we are not yet satisfied with the typical subsymbolic attitude of devices like neural networks: we can make them learn to solve even difficult problems, but without a clear explanation of why a solution works. Indeed, to use these devices widely, in a reliable and non-elementary way, we need formal and understandable expressions of the learnt functions. These must be susceptible of being tested, manipulated and composed with other similar expressions to build more structured functions as a solution to complex problems via the usual deductive methods of Artificial Intelligence. Many efforts have been steered in this direction in recent years, constructing artificial hybrid systems in which the subsymbolic processing of neural networks merges in various ways with symbolic algorithms. In parallel, neurobiology research keeps supplying more and more detailed explanations of the low-level phenomena responsible for mental processes.
Fuzzy Set Theory and Advanced Mathematical Applications contains contributions by many of the leading experts in the field, including coverage of the mathematical foundations of the theory, decision making and systems science, and recent developments in fuzzy neural control. The book supplies a readable, practical toolkit: a clear introduction to fuzzy set theory and its evolution in mathematics, together with new results on the foundations of fuzzy set theory, decision making and systems science, and fuzzy control and neural systems. Each chapter is self-contained, providing up-to-date coverage of its subject. Audience: an important reference work for university students, researchers, and engineers working in both industrial and academic settings.
Knowledge discovery is an area of computer science that attempts to uncover interesting and useful patterns in data that permit a computer to perform a task autonomously or assist a human in performing a task more efficiently. Soft Computing for Knowledge Discovery provides a self-contained and systematic exposition of the key theory and algorithms that form the core of knowledge discovery from a soft computing perspective. It focuses on knowledge representation, machine learning, and the key methodologies that make up the fabric of soft computing - fuzzy set theory, fuzzy logic, evolutionary computing, and various theories of probability (e.g. naive Bayes and Bayesian networks, Dempster-Shafer theory, mass assignment theory, and others). In addition to describing many state-of-the-art soft computing approaches to knowledge discovery, the author introduces Cartesian granule features and their corresponding learning algorithms as an intuitive approach to knowledge discovery. This new approach embraces the synergistic spirit of soft computing and exploits uncertainty in order to achieve tractability, transparency and generalization. Parallels are drawn between this approach and other well known approaches (such as naive Bayes and decision trees) leading to equivalences under certain conditions. The approaches presented are further illustrated in a battery of both artificial and real-world problems. Knowledge discovery in real-world problems, such as object recognition in outdoor scenes, medical diagnosis and control, is described in detail. These case studies provide further examples of how to apply the presented concepts and algorithms to practical problems. The author provides web page access to an online bibliography, datasets, source codes for several algorithms described in the book, and other information. Soft Computing for Knowledge Discovery is for advanced undergraduates, professionals and researchers in computer science, engineering and business information systems who work or have an interest in the dynamic fields of knowledge discovery and soft computing.
Advances in Computational Intelligence and Learning: Methods and Applications presents new developments and applications in the area of Computational Intelligence, which essentially describes methods and approaches that mimic biologically intelligent behavior in order to solve problems that have been difficult to solve by classical mathematics. Generally, Fuzzy Technology, Artificial Neural Nets and Evolutionary Computing are considered to be such approaches. The Editors have assembled new contributions in the areas of fuzzy sets, neural nets and machine learning, as well as combinations of them (so-called hybrid methods) in the first part of the book. The second part of the book is dedicated to applications in the areas that are considered to be most relevant to Computational Intelligence.
Mathematics of Fuzzy Sets: Logic, Topology and Measure Theory is a major attempt to provide much-needed coherence for the mathematics of fuzzy sets. Much of this book is new material required to standardize this mathematics, making this volume a reference tool with broad appeal as well as a platform for future research. Fourteen chapters are organized into three parts: mathematical logic and foundations (Chapters 1-2), general topology (Chapters 3-10), and measure and probability theory (Chapters 11-14). Chapter 1 deals with non-classical logics and their syntactic and semantic foundations. Chapter 2 details the lattice-theoretic foundations of image and preimage powerset operators. Chapters 3 and 4 lay down the axiomatic and categorical foundations of general topology using lattice-valued mappings as a fundamental tool. Chapter 3 focuses on the fixed-basis case, including a convergence theory demonstrating the utility of the underlying axioms. Chapter 4 focuses on the more general variable-basis case, providing a categorical unification of locales, fixed-basis topological spaces, and variable-basis compactifications. Chapter 5 relates lattice-valued topologies to probabilistic topological spaces and fuzzy neighborhood spaces. Chapter 6 investigates the important role of separation axioms in lattice-valued topology from the perspective of space embedding and mapping extension problems, while Chapter 7 examines separation axioms from the perspective of Stone-Čech compactification and Stone representation theorems. Chapters 8 and 9 introduce the most important concepts and properties of uniformities, including the covering and entourage approaches and the basic theory of precompact or complete [0,1]-valued uniform spaces. Chapter 10 sets out the algebraic, topological, and uniform structures of the fundamentally important fuzzy real line and fuzzy unit interval. Chapter 11 lays the foundations of generalized measure theory and representation by Markov kernels. Chapter 12 develops the important theory of conditioning operators with applications to measure-free conditioning. Chapter 13 presents elements of pseudo-analysis with applications to the Hamilton-Jacobi equation and optimization problems. Chapter 14 surveys briefly the fundamentals of fuzzy random variables, which are [0,1]-valued interpretations of random sets.
Stochastic analysis is not only a thriving area of pure mathematics with intriguing connections to partial differential equations and differential geometry. It also has numerous applications in the natural and social sciences (for instance in financial mathematics or theoretical quantum mechanics) and therefore appears in physics and economics curricula as well. However, existing approaches to stochastic analysis either presuppose various concepts from measure theory and functional analysis or lack full mathematical rigour. This short book proposes to solve the dilemma: By adopting E. Nelson's "radically elementary" theory of continuous-time stochastic processes, it is based on a demonstrably consistent use of infinitesimals and thus permits a radically simplified, yet perfectly rigorous approach to stochastic calculus and its fascinating applications, some of which (notably the Black-Scholes theory of option pricing and the Feynman path integral) are also discussed in the book.
In recent years, there have been several attempts to define a logic for information retrieval (IR). The aim was to provide a rich and uniform representation of information and its semantics with the goal of improving retrieval effectiveness. The basis of a logical model for IR is the assumption that queries and documents can be represented effectively by logical formulae. To retrieve a document, an IR system has to infer the formula representing the query from the formula representing the document. This logical interpretation of query and document emphasizes that relevance in IR is an inference process. The use of logic to build IR models enables one to obtain models that are more general than earlier well-known IR models. Indeed, some logical models are able to represent within a uniform framework various features of IR systems such as hypermedia links, multimedia data, and user's knowledge. Logic also provides a common approach to the integration of IR systems with logical database systems. Finally, logic makes it possible to reason about an IR model and its properties. This latter possibility is becoming increasingly more important since conventional evaluation methods, although good indicators of the effectiveness of IR systems, often give results which cannot be predicted, or for that matter satisfactorily explained. However, logic by itself cannot fully model IR. The success or the failure of the inference of the query formula from the document formula is not enough to model relevance in IR. It is necessary to take into account the uncertainty inherent in such an inference process. In 1986, Van Rijsbergen proposed the uncertainty logical principle to model relevance as an uncertain inference process. When proposing the principle, Van Rijsbergen was not specific about which logic and which uncertainty theory to use. As a consequence, various logics and uncertainty theories have been proposed and investigated. The choice of an appropriate logic and uncertainty mechanism has been a main research theme in logical IR modeling leading to a number of logical IR models over the years. Information Retrieval: Uncertainty and Logics contains a collection of exciting papers proposing, developing and implementing logical IR models. This book is appropriate for use as a text for a graduate-level course on Information Retrieval or Database Systems, and as a reference for researchers and practitioners in industry.
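The inference view sketched above lends itself to a small illustration. The following is a hypothetical sketch only, assuming a bag-of-terms representation rather than any logic proposed in the book or by Van Rijsbergen: the certainty of the inference d -> q is crudely approximated by the fraction of query terms the document can supply.

```python
# Hypothetical sketch of relevance as uncertain inference (d -> q).
# The term-overlap estimate below is an illustrative stand-in, not the
# uncertainty logical principle itself or any model from the book.

def uncertain_inference(document_terms: set, query_terms: set) -> float:
    """Crudely estimate the certainty with which the document entails the query."""
    if not query_terms:
        return 1.0  # an empty query is trivially entailed
    supported = query_terms & document_terms   # query terms the document can "prove"
    return len(supported) / len(query_terms)

docs = {
    "d1": {"fuzzy", "logic", "retrieval", "uncertainty"},
    "d2": {"database", "transaction", "recovery"},
}
query = {"uncertainty", "retrieval"}

# Rank documents by the estimated certainty of the inference d -> q.
ranking = sorted(docs, key=lambda d: uncertain_inference(docs[d], query), reverse=True)
print(ranking)  # ['d1', 'd2']
```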
This monograph studies the logical aspects of domains as used in denotational semantics of programming languages. Frameworks of domain logics are introduced; these serve as foundations for systematic derivations of proof systems from denotational semantics of programming languages. Any proof system so derived is guaranteed to agree with denotational semantics in the sense that the denotation of any program coincides with the set of assertions true of it. The study focuses on two categories for denotational semantics: SFP domains, and the less standard, but important, category of stable domains. The intended readership of this monograph includes researchers and graduate students interested in the relation between semantics of programming languages and formal means of reasoning about programs. A basic knowledge of denotational semantics, mathematical logic, general topology, and category theory is helpful for a full understanding of the material. Part I, SFP Domains; Chapter 1, Introduction: This chapter provides a brief exposition of domain theory, denotational semantics, program logics, and proof systems. It discusses the importance of ideas and results on logic and topology to the understanding of the relation between denotational semantics and program logics. It also describes the motivation for the work presented by this monograph, and how that work fits into a more general program. Finally, it gives a short summary of the results of each chapter. Section 1.1, Domain Theory: Programming languages are languages with which to perform computation.
This book is about Granular Computing (GC) - an emerging conceptual and computing paradigm of information processing. As the name suggests, GC concerns the processing of complex information entities - information granules. In essence, information granules arise in the process of abstraction of data and derivation of knowledge from information. Information granules are everywhere. We commonly use granules of time (seconds, months, years). We granulate images; millions of pixels manipulated individually by computers appear to us as granules representing physical objects. In natural language, we operate on the basis of word-granules that become crucial entities used to realize interaction and communication between humans. Intuitively, we sense that information granules are at the heart of all our perceptual activities. In the past, several formal frameworks and tools geared for processing specific information granules have been proposed. Interval analysis, rough sets, and fuzzy sets have all played an important role in knowledge representation and processing. Subsequently, information granulation and information granules arose in numerous application domains. Well-known ideas of rule-based systems dwell inherently on information granules. Qualitative modeling, one of the leading threads of AI, operates on a level of information granules. Multi-tier architectures and hierarchical systems (such as those encountered in control engineering), planning and scheduling systems all exploit information granularity. We also utilize information granules when it comes to functionality granulation, reusability of information, and efficient ways of developing underlying information infrastructures.
Since the introduction of genetic algorithms in the 1970s, an enormous number of articles together with several significant monographs and books have been published on this methodology. As a result, genetic algorithms have made a major contribution to optimization, adaptation, and learning in a wide variety of unexpected fields. Over the years, many excellent books in genetic algorithm optimization have been published; however, they focus mainly on single-objective discrete or other hard optimization problems under certainty. There appears to be no book that is designed to present genetic algorithms for solving not only single-objective but also fuzzy and multiobjective optimization problems in a unified way. Genetic Algorithms And Fuzzy Multiobjective Optimization introduces the latest advances in the field of genetic algorithm optimization for 0-1 programming, integer programming, nonconvex programming, and job-shop scheduling problems under multiobjectiveness and fuzziness. In addition, the book treats a wide range of actual real world applications. The theoretical material and applications place special stress on interactive decision-making aspects of fuzzy multiobjective optimization for human-centered systems in most realistic situations when dealing with fuzziness. The intended readers of this book are senior undergraduate students, graduate students, researchers, and practitioners in the fields of operations research, computer science, industrial engineering, management science, systems engineering, and other engineering disciplines that deal with the subjects of multiobjective programming for discrete or other hard optimization problems under fuzziness. Real world research applications are used throughout the book to illustrate the presentation. These applications are drawn from complex problems. Examples include flexible scheduling in a machine center, operation planning of district heating and cooling plants, and coal purchase planning in an actual electric power plant.
The significance of the foundational debate in mathematics that took place in the 1920s seems to have been recognized only in circles of mathematicians and philosophers. It was a period in the history of mathematics when mathematics and philosophy, usually so far away from each other, seemed to meet. The foundational debate is presented with all its brilliant contributions and its shortcomings, its new ideas and its misunderstandings.
Fuzzy Sets in Decision Analysis, Operations Research and Statistics includes chapters on fuzzy preference modeling, multiple criteria analysis, ranking and sorting methods, group decision-making and fuzzy game theory. It also presents optimization techniques such as fuzzy linear and non-linear programming, applications to graph problems and fuzzy combinatorial methods such as fuzzy dynamic programming. In addition, the book accounts for advances in fuzzy data analysis, fuzzy statistics, and applications to reliability analysis. These topics are covered within four parts: Decision Making; Mathematical Programming; Statistics and Data Analysis; and Reliability, Maintenance and Replacement. The scope and content of the book have resulted from multiple interactions between the editor of the volume, the series editors, the series advisory board, and experts in each chapter area. Each chapter was written by a well-known researcher on the topic and reviewed by other experts in the area. These expert reviewers sometimes became co-authors because of the extent of their contribution to the chapter. As a result, twenty-five authors from twelve countries and four continents were involved in the creation of the 13 chapters, which enhances the international character of the project and gives an idea of how carefully the Handbook has been developed.
Convexity of sets in linear spaces, and concavity and convexity of functions, lie at the root of beautiful theoretical results that are at the same time extremely useful in the analysis and solution of optimization problems, including problems with either a single objective or multiple objectives. Not all of these results rely necessarily on convexity and concavity; some of the results can guarantee that each local optimum is also a global optimum, giving these methods broader application to a wider class of problems. Hence, the first part of the book is concerned with several types of generalized convex sets and generalized concave functions. In addition to their applicability to nonconvex optimization, these generalized convex sets and generalized concave functions are used in the book's second part, where decision-making and optimization problems under uncertainty are investigated. Uncertainty in the problem data often cannot be avoided when dealing with practical problems. Errors occur in real-world data for a host of reasons. However, over the last thirty years, the fuzzy set approach has proved to be useful in these situations. It is this approach to optimization under uncertainty that is extensively used and studied in the second part of this book. Typically, the membership functions of fuzzy sets involved in such problems are neither concave nor convex. They are, however, often quasiconcave or concave in some generalized sense. This opens possibilities for application of results on generalized concavity to fuzzy optimization. Despite this obvious relation, work at the interface of these two areas has been limited to date. It is hoped that the combination of ideas and results from the field of generalized concavity on the one hand and fuzzy optimization on the other hand, outlined and discussed in Generalized Concavity in Fuzzy Optimization and Decision Analysis, will be of interest to both communities. Our aim is to broaden the classes of problems that the combination of these two areas can satisfactorily address and solve.
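A tiny numerical check can make the quasiconcavity remark above concrete. This is a hypothetical sketch, not material from the book: a triangular membership function fails ordinary concavity on the whole real line (because of the flat zero region outside its support) yet satisfies quasiconcavity, which is exactly the kind of generalized concavity the text refers to.

```python
# Hypothetical sketch (not from the book): a triangular fuzzy membership
# function is quasiconcave on the whole real line but not concave there.

def triangular(x: float, a: float = 0.0, b: float = 1.0, c: float = 2.0) -> float:
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def concave_at(f, x1: float, x2: float, lam: float = 0.5) -> bool:
    # Ordinary concavity inequality at one convex combination of x1 and x2.
    return f(lam * x1 + (1 - lam) * x2) >= lam * f(x1) + (1 - lam) * f(x2) - 1e-12

def quasiconcave_at(f, x1: float, x2: float, lam: float = 0.5) -> bool:
    # Quasiconcavity inequality: value at the midpoint beats the smaller endpoint.
    return f(lam * x1 + (1 - lam) * x2) >= min(f(x1), f(x2)) - 1e-12

# Concavity fails across the flat region outside the support ...
print(concave_at(triangular, -1.0, 1.0))       # False
# ... but quasiconcavity still holds at the same pair of points.
print(quasiconcave_at(triangular, -1.0, 1.0))  # True
```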
This book contains the lectures given at the NATO ASI 910820 "Cellular Automata and Cooperative Systems" Meeting, which was held at the Centre de Physique des Houches, France, from June 22 to July 2, 1992. This workshop brought together mathematical physicists, theoretical physicists and mathematicians working in fields related to local interacting systems, cellular and probabilistic automata, statistical physics, and complexity theory, as well as applications of these fields. We would like to thank our sponsors and supporters whose interest and help was essential for the success of the meeting: the NATO Scientific Affairs Division, the DRET (Direction des Recherches, Etudes et Techniques), the Ministere des Affaires Etrangeres, and the National Science Foundation. We would also like to thank all the secretaries who helped us during the preparation of the meeting, in particular Maryse Cohen-Solal (CPT, Marseille) and Janice Nowinski (Courant Institute, New York). We are grateful for the fine work of Mrs. Gladys Cavallone in preparing this volume.
Uncertainty has been of concern to engineers, managers and scientists for many centuries. In management sciences there have existed definitions of uncertainty in a rather narrow sense since the beginning of this century. In engineering and in the sciences, however, uncertainty has for a long time been considered as synonymous with random, stochastic, statistic, or probabilistic. Only since the early sixties have views on uncertainty become more heterogeneous, and more tools to model uncertainty than statistics have been proposed by several scientists. The problem of modeling uncertainty adequately has become more important the more complex systems have become, the faster the scientific and engineering world develops, and the more important, but also more difficult, forecasting of future states of systems has become. The first question one should probably ask is whether uncertainty is a phenomenon, a feature of real world systems, a state of mind or a label for a situation in which a human being wants to make statements about phenomena, i.e., reality, models, and theories, respectively. One can also ask whether uncertainty is an objective fact or just a subjective impression which is closely related to individual persons. Whether uncertainty is an objective feature of physical real systems seems to be a philosophical question. This shall not be answered in this volume.
Assuming that the reader is familiar with sheaf theory, the book gives a self-contained introduction to the theory of constructible sheaves related to many kinds of singular spaces, such as cell complexes, triangulated spaces, semialgebraic and subanalytic sets, complex algebraic or analytic sets, stratified spaces, and quotient spaces. The relation to the underlying geometrical ideas is worked out in detail, together with many applications to the topology of such spaces. All chapters have their own detailed introduction, containing the main results and definitions, illustrated in simple terms by a number of examples. The technical details of the proofs are postponed to later sections, since these are not needed for the applications.
The theory of finite automata on finite strings, infinite strings, and trees has had a distinguished history. First, automata were introduced to represent idealized switching circuits augmented by unit delays. This was the period of Shannon, McCulloch and Pitts, and Howard Aiken, ending about 1950. Then in the 1950s there was the work of Kleene on representable events, of Myhill and Nerode on finite coset congruence relations on strings, and of Rabin and Scott on power set automata. In the 1960s, there was the work of Büchi on automata on infinite strings and the second order theory of one successor, then Rabin's 1968 result on automata on infinite trees and the second order theory of two successors. The latter was a mystery until the introduction of forgetful determinacy games by Gurevich and Harrington in 1982. Each of these developments has successful and prospective applications in computer science. They should all be part of every computer scientist's toolbox. Suppose that we take a computer scientist's point of view. One can think of finite automata as the mathematical representation of programs that run using fixed finite resources. Then Büchi's S1S can be thought of as a theory of programs which run forever (like operating systems or banking systems) and are deterministic. Finally, Rabin's S2S is a theory of programs which run forever and are nondeterministic. Indeed many questions of verification can be decided in the decidable theories of these automata.
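The reading of finite automata as programs running with fixed finite resources can be illustrated with a minimal example. The sketch below is hypothetical and not drawn from the book: a two-state deterministic finite automaton that accepts exactly the binary strings containing an even number of 1s, using constant memory regardless of input length.

```python
# Minimal DFA sketch: a "program" with fixed finite resources (two states),
# accepting binary strings with an even number of 1s. Illustrative only.

ACCEPTING = {"even"}
TRANSITIONS = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def accepts(word: str, start: str = "even") -> bool:
    state = start
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]  # constant memory: only the current state
    return state in ACCEPTING

assert accepts("1010")      # two 1s -> accepted
assert not accepts("111")   # three 1s -> rejected
```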
This book shows how the application of fuzzy logic can benefit management, group decision making, strategic planning, supply chain management and other business imperatives. The theoretical analysis is fully supported by real-life case studies. The book develops themes that businesses can use to master effectiveness and quality, work with flexibility, and support continuous learning in the organization and the individual.
Fuzzy Logic and Soft Computing contains contributions from world-leading experts from both the academic and industrial communities. The first part of the volume consists of invited papers by international authors describing possibilistic logic in decision analysis, fuzzy dynamic programming in optimization, linguistic modifiers for word computation, and theoretical treatments and applications of fuzzy reasoning. The second part is composed of eleven contributions from Chinese authors focusing on some of the key issues in the fields: stable adaptive fuzzy control systems, partial evaluations and fuzzy reasoning, fuzzy wavelet neural networks, analysis and applications of genetic algorithms, partial repeatability, rough set reduction for data enriching, limits of agents in process calculus, medium logic and its evolution, and factor spaces canes. These contributions are not only theoretically sound and well-formulated, but are also coupled with applicability implications and/or implementation treatments. The domains of applications realized or implied are: decision analysis, word computation, databases and knowledge discovery, power systems, control systems, and multi-destinational routing. Furthermore, the articles contain materials that are an outgrowth of recently conducted research, addressing fundamental and important issues of fuzzy logic and soft computing.
In decision theory there are basically two approaches to the modeling of individual choice: one is based on an absolute representation of preferences, leading to a numerical expression of preference intensity. This is utility theory. The other approach is based on binary relations that encode pairwise preference. While the former has mainly blossomed in the Anglo-Saxon academic world, the latter is mostly advocated in continental Europe, including Russia. The advantage of the utility theory approach is that it integrates uncertainty about the state of nature, which may affect the consequences of a decision. Then, the problems of choice and ranking from the knowledge of preferences become trivial once the utility function is known. In the case of the relational approach, the model does not explicitly account for uncertainty, hence it looks less sophisticated. On the other hand, it is more descriptive than normative in the first place, because it takes the pairwise preference pattern expressed by the decision-maker as it is and tries to make the best out of it. In particular, the preference relation is not supposed to have any particular properties. The main problem with the utility theory approach is the gap between what decision-makers are and can express, and what the theory would like them to be and to be capable of expressing. With the relational approach this gap does not exist, but the main difficulty is now to build up convincing choice rules and ranking rules that may help the decision process.
Non-Classical Logics and their Applications to Fuzzy Subsets is the first major work devoted to a careful study of various relations between non-classical logics and fuzzy sets. This volume is indispensable for all those who are interested in a deeper understanding of the mathematical foundations of fuzzy set theory, particularly in intuitionistic logic, Łukasiewicz logic, monoidal logic, fuzzy logic and topos-like categories. The tutorial nature of the longer chapters, the comprehensive bibliography and index make it suitable as a valuable and important reference for graduate students as well as research workers in the field of non-classical logics. The book is arranged in three parts: Part A presents the most recent developments in the theory of Heyting algebras, MV-algebras, quantales and GL-monoids. Part B gives a coherent and current account of topos-like categories for fuzzy set theory based on Heyting algebra valued sets, quantal sets and M-valued sets. Part C addresses general aspects of non-classical logics including epistemological problems as well as recursive properties of fuzzy logic.
The book consists of articles at the frontier of current research in Algebraic Topology. It presents recent results by top-notch experts, and is intended primarily for researchers and graduate students working in the field of algebraic topology. Included is an important article by Cohen, Jones and Yan on the homology of the space of smooth loops on a manifold M, endowed with the Chas-Sullivan intersection product, as well as an article by Goerss, Henn and Mahowald on stable homotopy groups of spheres, which uses the cutting edge technology of "topological modular forms."