This book constitutes the refereed proceedings of the Second International Conference on Interactive Theorem Proving, ITP 2011, held in Berg en Dal, The Netherlands, in August 2011.
Intended for first- or second-year undergraduates, this introduction to discrete mathematics covers the usual topics of such a course, but applies constructivist principles that promote - indeed, require - active participation by the student. Working with the programming language ISETL, whose syntax is close to that of standard mathematical language, the student constructs the concepts in her or his mind as a result of constructing them on the computer in the syntax of ISETL. This dramatically different approach allows students to attempt to discover concepts in a "Socratic" dialog with the computer. The discussion avoids the formal "definition-theorem" approach and promotes active involvement by the reader by its questioning style. An instructor using this text can expect a lively class whose students develop a deep conceptual understanding rather than simply manipulative skills. Topics covered in this book include: the propositional calculus, operations on sets, basic counting methods, predicate calculus, relations, graphs, functions, and mathematical induction.
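To give a flavour of this experiment-first, constructive style, here is a minimal hypothetical sketch. It is not taken from the book, and it uses Python rather than ISETL (the book's language, whose syntax is closer to mathematical notation); the sets and the conjecture are illustrative only.

```python
# Hypothetical Python analogue of the book's constructive style
# (the book itself uses ISETL; names and numbers here are illustrative).

evens = {x for x in range(1, 21) if x % 2 == 0}   # a set defined by a property
squares = {x * x for x in range(1, 21)}           # another constructed set

print(evens & squares)                            # set intersection: {4, 16}

# A conjecture to "discover" by experiment before attempting a proof:
# every even perfect square in the sample is divisible by 4.
print(all(s % 4 == 0 for s in squares if s % 2 == 0))   # True
```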
by Luca Cardelli: Ever since Strachey's work in the 1960s, polymorphism has been classified into the parametric and overloading varieties. Parametric polymorphism has been the subject of extensive study for over two decades. Overloading, on the other hand, has often been considered too ad hoc to deserve much attention even though it has been, in some form, an ingredient of virtually every programming language (much more so than parametric polymorphism). With the introduction of object-oriented languages, and in particular with multiple-dispatch object-oriented languages, overloading has become less of a programming convenience and more of a fundamental feature in need of proper explanation. This book provides a compelling framework for the study of run-time overloading and of its interactions with subtyping and with parametric polymorphism. The book also describes applications to object-oriented programming. This new framework is motivated by the relatively recent spread of programming languages that are entirely based on run-time overloading; this fact probably explains why this subject was not investigated earlier. Once properly understood, overloading reveals itself relevant also to the study of older and more conventional (single-dispatch) object-oriented languages, clarifying delicate issues of covariance and contravariance of method types, and of run-time type analysis. In the final chapters, a synthesis is made between parametric and overloading polymorphism.
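To make the distinction concrete, the following is a small hypothetical Python sketch, not the book's formal framework: a parametrically polymorphic function defined once for all types, next to a hand-rolled dispatch table in which the branch chosen depends on the run-time types of both arguments (the multiple-dispatch flavour of overloading the blurb refers to). All names (`twice`, `defmethod`, `collide`, the `Shape` classes) are invented for illustration.

```python
from typing import Callable, Dict, Tuple, Type, TypeVar

# Parametric polymorphism: one definition that behaves uniformly at every type.
T = TypeVar("T")

def twice(f: Callable[[T], T], x: T) -> T:
    return f(f(x))

# Run-time overloading with multiple dispatch: the implementation chosen
# depends on the dynamic types of *both* arguments, with subtype matches allowed.
_table: Dict[Tuple[Type, Type], Callable] = {}

def defmethod(a: Type, b: Type):
    def register(fn: Callable) -> Callable:
        _table[(a, b)] = fn
        return fn
    return register

def collide(x, y):
    exact = _table.get((type(x), type(y)))
    if exact is not None:                    # most specific (exact) match first
        return exact(x, y)
    for (ta, tb), fn in _table.items():      # otherwise any subtype match
        if isinstance(x, ta) and isinstance(y, tb):
            return fn(x, y)                  # a real system would rank by specificity
    raise TypeError("no applicable method")

class Shape: ...
class Circle(Shape): ...

@defmethod(Shape, Shape)
def _(x, y): return "generic shape overlap test"

@defmethod(Circle, Circle)
def _(x, y): return "exact circle-circle test"

print(twice(lambda n: n + 1, 0))      # 2: the same code works at int, str, ...
print(collide(Circle(), Circle()))    # picks the circle-specific branch at run time
print(collide(Circle(), Shape()))     # falls back to the Shape/Shape branch
```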
Both object orientation and parallelism are modern programming paradigms which have gained much popularity in the last 10-15 years. Object orientation raises hopes for increased productivity of software generation and maintenance methods. Parallelism can serve to structure a problem but also promises faster program execution. The two areas of computing science in which these paradigms play the most prominent role are programming languages and databases. In programming languages, one can take an academic approach with a primary focus on the generality of the semantics of the language constructs which support the respective paradigm. In databases, one is willing to restrict the power of the constructs in the interest of increased efficiency. Inter- and intra-object parallelism have received an increasing amount of attention in the last few years by researchers in the area of object-oriented programming. At first glance, an object is very similar to a process which offers services to other processes and demands services from them. It has, however, transpired that object-oriented concepts cause problems when combined with parallelism. In programming languages, the introduction of parallelism and the synchronization constraints it brings with it can get in the way of code reusability. In databases, the combination of object orientation and parallelism requires, for example, a generalization of the transaction model, new approaches to the specification of information systems, an implementation model of object communication, and the design of an overall system architecture. There has been insufficient communication between researchers in programming languages and in databases on these issues. Object Orientation with Parallelism and Persistence grew out of a Dagstuhl Seminar of the same title in April 1995, whose goal was to put the new research area of 'object orientation with parallelism' on an interdisciplinary basis. Object Orientation with Parallelism and Persistence will be of interest to researchers and professionals working in software engineering, programming languages, and database systems.
This book constitutes the refereed proceedings of the 9th International Symposium on Automated Technology for Verification and Analysis, ATVA 2011, held in Taipei, Taiwan, in October 2011.
An up-to-date and comprehensive account of set-oriented symbolic manipulation and automated reasoning methods. This book is of interest to graduates and researchers in theoretical computer science and computational logic and automated reasoning.
Assembly language continues to hold a core position in the programming world because of its similar structure to machine language and its very close links to underlying computer-processor architecture and design. These features allow for high processing speed, low memory demands, and the capacity to act directly on the system's hardware. This completely revised second edition of the highly successful Introduction to Assembly Language Programming introduces the reader to assembly language programming and its role in computer programming and design. The focus is on providing readers with a firm grasp of the main features of assembly programming, and how it can be used to improve a computer's performance. The revised edition covers a broad scope of subjects and adds valuable material on protected-mode Pentium programming, MIPS assembly language programming, and use of the NASM and SPIM assemblers for a Linux orientation. All of the language's main features are covered in depth. The book requires only some basic experience with a structured, high-level language. Topics and features: introduces assembly language so that readers can benefit from learning its utility with both CISC and RISC processors [NEW]; employs the freely available NASM assembler, which works with both Microsoft Windows and Linux operating systems [NEW]; contains a revised chapter on "Basic Computer Organization" [NEW]; uses numerous examples, hands-on exercises, programming code analyses and challenges, and chapter summaries; and incorporates full new chapters on recursion, protected-mode interrupt processing, and floating-point instructions [NEW]. Assembly language programming is part of several undergraduate curricula in computer science, computer engineering, and electrical engineering. In addition, this newly revised text/reference can be used as an ideal companion resource in a computer organization course or as a resource for professional courses.
This book constitutes the proceedings of the 12th International Workshop on Computational Logic in Multi-Agent Systems, CLIMA XII, held in Barcelona, Spain, in July 2011.
This book constitutes the refereed proceedings of the 18th International Symposium on Static Analysis, SAS 2011, held in Venice, Italy, in September 2011. The 22 revised full papers were selected from 67 submissions. Also included in this volume are the abstracts of the invited talks that were given at the symposium by renowned experts in the field. The papers address all aspects of static analysis, including abstract domains, abstract interpretation, abstract testing, data flow analysis, bug detection, program transformation, program verification, security analysis and type checking.
This book constitutes the refereed proceedings of the 4th International Conference, ICMT 2011, held in Zurich, Switzerland, in June 2011. The 14 revised full papers were carefully reviewed and selected from 51 submissions. The scope of the contributions ranges from theoretical and methodological topics to implementation issues and applications. Topics addressed include transformation paradigms and languages, transformation algorithms and strategies, implementation and tools, as well as applications and case studies.
The book constitutes the refereed proceedings of the 10th International Conference on Software Composition, SC 2011, held in Zurich, Switzerland, in June/July 2011, co-located with TOOLS 2011 Federated Conferences. The 10 revised full papers and 2 short papers were carefully reviewed and selected from 32 initial submissions for inclusion in the book. The papers reflect all current research in software composition and are organized in topical sections on composition and interfaces, aspects and features, and applications.
These proceedings contain all the papers that were presented at the 4th International Conference on Language and Automata Theory and Applications (LATA 2010), held in Trier, Germany, during May 24-28, 2010. The scope of LATA is rather broad, including: algebraic language theory; algorithms on automata and words; automata and logic; automata for system analysis and program verification; automata, concurrency and Petri nets; cellular automata; combinatorics on words; computability; computational complexity; computer linguistics; data and image compression; decidability questions on words and languages; descriptional complexity; DNA and other models of bio-inspired computing; document engineering; foundations of finite state technology; fuzzy and rough languages; grammars (Chomsky hierarchy, contextual, multidimensional, unification, categorial, etc.); grammars and automata architectures; grammatical inference and algorithmic learning; graphs and graph transformation; language varieties and semigroups; language-based cryptography; language-theoretic foundations of artificial intelligence and artificial life; neural networks; parallel and regulated rewriting; parsing; pattern matching and pattern recognition; patterns and codes; power series; quantum, chemical and optical computing; semantics; string and combinatorial issues in computational biology and bioinformatics; symbolic dynamics; term rewriting; text algorithms; text retrieval; transducers; trees, tree languages and tree machines; and weighted machines. LATA 2010 received 115 submissions, many among them of good quality. Each one was reviewed by at least three Program Committee members plus, in most cases, by additional external referees. After a thorough and vivid discussion phase, the committee decided to accept 47 papers (which means an acceptance rate of 40.86%). The conference program also included four invited talks.
ETAPS 2010 was the 13th instance of the European Joint Conferences on Theory and Practice of Software. ETAPS is an annual federated conference that was established in 1998 by combining a number of existing and new conferences. This year it comprised the usual five sister conferences (CC, ESOP, FASE, FOSSACS, TACAS), 19 satellite workshops (ACCAT, ARSPA-WITS, Bytecode, CMCS, COCV, DCC, DICE, FBTC, FESCA, FOSS-AMA, GaLoP, GT-VMT, LDTA, MBT, PLACES, QAPL, SafeCert, WGT, and WRLA) and seven invited lectures (excluding those that were specific to the satellite events). The five main conferences this year received 497 submissions (including 31 tool demonstration papers), 130 of which were accepted (10 tool demos), giving an overall acceptance rate of 26%, with most of the conferences at around 24%. Congratulations therefore to all the authors who made it to the final programme! I hope that most of the other authors will still have found a way of participating in this exciting event, and that you will all continue submitting to ETAPS and contributing to make of it the best conference on software science and engineering. The events that comprise ETAPS address various aspects of the system development process, including specification, design, implementation, analysis and improvement. The languages, methodologies and tools which support these activities are all well within its scope. Different blends of theory and practice are represented, with an inclination toward theory with a practical motivation on the one hand and soundly based practice on the other. Many of the issues involved in software design apply to systems in general, including hardware systems, and the emphasis on software is not intended to be exclusive. ETAPS is a confederation in which each event retains its own identity, with a separate Programme Committee and proceedings. Its format is open-ended, allowing it to grow and evolve as time goes by.
Galois connections provide the order- or structure-preserving passage between two worlds of our imagination - and thus are inherent in human thinking wherever logical or mathematical reasoning about certain hierarchical structures is involved. Order-theoretically, a Galois connection is given simply by two opposite order-inverting (or order-preserving) maps whose composition yields two closure operations (or one closure and one kernel operation in the order-preserving case). Thus, the "hierarchies" in the two opposite worlds are reversed or transported when passing to the other world, and going forth and back becomes a stationary process when iterated. The advantage of such an "adjoint situation" is that information about objects and relationships in one of the two worlds may be used to gain new information about the other world, and vice versa. In classical Galois theory, for instance, properties of permutation groups are used to study field extensions. Or, in algebraic geometry, a good knowledge of polynomial rings gives insight into the structure of curves, surfaces and other algebraic varieties, and conversely. Moreover, restriction to the "Galois-closed" or "Galois-open" objects (the fixed points of the composite maps) leads to a precise "duality" between two maximal subworlds.
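In the standard order-theoretic notation (a conventional formulation, not a quotation from the book), the two cases described above read as follows:

```latex
\[
\text{Antitone case: order-inverting } f\colon P\to Q,\ g\colon Q\to P
\text{ with }\ q \le f(p) \iff p \le g(q);
\text{ then } g\circ f \text{ and } f\circ g \text{ are both closure operators, i.e. }
p \le g(f(p)),\ q \le f(g(q)),\ \text{monotone and idempotent.}
\]
\[
\text{Monotone (adjoint) case: order-preserving } f, g
\text{ with }\ f(p) \le q \iff p \le g(q);
\text{ then } g\circ f \text{ is a closure operator on } P
\text{ and } f\circ g \text{ a kernel (interior) operator on } Q.
\]
```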
Service-Oriented Computing is a paradigm for developing and providing software that can address many IT challenges, ranging from integrating legacy systems to building new, massively distributed, interoperable, evolvable systems and applications. The widespread use of SOC demonstrates the practical benefits of this approach. Furthermore, it raises the standard for reliability, security, and performance for IT providers, system integrators, and software developers. This book documents the main results of Sensoria, an Integrated Project funded by the European Commission in the period 2005-2010. The book presents, as Sensoria's essence, a novel, coherent, and comprehensive approach to the design, formal analysis, automated deployment, and reengineering of service-oriented applications. Following a motivating introduction, the 32 chapters are organized in the following topical parts: modeling in service-oriented architectures; calculi for service-oriented computing; negotiation, planning, and reconfiguration; qualitative analysis techniques for SOC; quantitative analysis techniques for SOC; model-driven development and reverse engineering for service-oriented systems; and case studies and patterns.
In the multi-agent systems area, linking theory to practical applications is still a fertile research topic. The aim of the workshop on Declarative Agent Languages and Technologies (DALT 2009), in its seventh edition this year, is to achieve this goal, which needs developing and using advanced declarative technologies and languages, particularly agent programming, communication languages, and reasoning and decision-making mechanisms. Developing these technologies is a particularly challenging issue from many perspectives: formal foundations, practical feasibility, degree of flexibility, etc. In this context, the declarative paradigm is arguably the most appropriate as, unlike imperative approaches, the focus is on what the solution should accomplish rather than on describing how to accomplish it. This is because agent computing, as a paradigm, is about describing the logic of computation instead of describing how to accomplish it. DALT is about investigating, studying, and using the declarative paradigm as well as combining declarative and formal approaches with engineering and technology aspects of agents and multi-agent systems. This volume presents the latest developments in the area of declarative languages and technologies, which aim to provide rigorous frameworks for designing, specifying, implementing and verifying autonomous interacting agents. These frameworks are based on computational logics and other formal methods such as mathematical models and game theoretical approaches. Using such models and approaches facilitates the development of agents that reason and act rationally while at the same time being able to verify the behavior of these agents against their specification. The main theme of DALT 2009 was the further advancement of relevant specification and verification techniques, such as, for instance, modal and epistemic logics, model checking, constraint logic programming, and distributed constraint satisfaction.
The art, craft, discipline, logic, practice and science of developing large-scale software products needs a professional base. The textbooks in this three-volume set combine informal, engineeringly sound approaches with the rigor of formal, mathematics-based approaches. This volume covers the basic principles and techniques of specifying systems and languages. It deals with modelling the semiotics (pragmatics, semantics and syntax of systems and languages), modelling spatial and simple temporal phenomena, and such specialized topics as modularity (incl. UML class diagrams), Petri nets, live sequence charts, statecharts, and temporal logics, including the duration calculus. Finally, the book presents techniques for interpreter and compiler development of functional, imperative, modular and parallel programming languages. This book is targeted at late undergraduate to early graduate university students, and researchers of programming methodologies. Vol. 1 of this series is a prerequisite text.
This is a monograph about logic. Specifically, it presents the mathematical theory of the logic of bunched implications, BI: I consider BI's proof theory, model theory and computation theory. However, the monograph is also about informatics in a sense which I explain. Specifically, it is about mathematical models of resources and logics for reasoning about resources. I begin with an introduction which presents my (background) view of logic from the point of view of informatics, paying particular attention to three logical topics which have arisen from the development of logic within informatics: resources as a basis for semantics; proof-search as a basis for reasoning; and the theory of representation of object-logics in a meta-logic. The ensuing development represents a logical theory which draws upon the mathematical, philosophical and computational aspects of logic. Part I presents the logical theory of propositional BI, together with a computational interpretation. Part II presents a corresponding development for predicate BI. In both parts, I develop proof-, model- and type-theoretic analyses. I also provide semantically-motivated computational perspectives, so beginning a mathematical theory of resources. I have not included any analysis, beyond conjecture, of properties such as decidability, finite models, games or complexity. I prefer to leave these matters to other occasions, perhaps in broader contexts.
This remarkable anthology allows the pioneers who orchestrated the major breakthroughs in operating system technology to describe their work in their own words. From the batch processing systems of the 1950s to the distributed systems of the 1990s, Tom Kilburn, David Howarth, Bill Lynch, Fernando Corbato, Robert Daley, Sandy Fraser, Dennis Ritchie, Ken Thompson, Edsger Dijkstra, Per Brinch Hansen, Soren Lauesen, Barbara Liskov, Joe Stoy, Christopher Strachey, Butler Lampson, David Redell, Brian Randell, Andrew Tanenbaum, and others describe the systems they designed. The volume details such classic operating systems as the Atlas, B5000, Exec II, Egdon, CTSS, Multics, Titan, Unix, THE, RC 4000, Venus, Boss 2, Solo, OS 6, Alto, Pilot, Star, WFS, Unix United, and Amoeba systems. An introductory essay on the evolution of operating systems summarizes the papers and helps put them into a larger perspective. This provocative journey captures the historic contributions of operating systems to software design, concurrent programming, graphic user interfaces, file systems, personal computing, and distributed systems. It also fully portrays how operating systems designers think. It's ideal for everybody in the field, from students to professionals, academics to enthusiasts.
We are pleased to present the proceedings of the Second International Conference on Software Language Engineering (SLE 2009). The conference was held in Denver, Colorado (USA) during October 5-6, 2009 and was co-located with the 12th IEEE/ACM International Conference on Model-Driven Engineering Languages and Systems (MODELS 2009) and the 8th ACM International Conference on Generative Programming and Component Engineering (GPCE 2009). The SLE conference series is devoted to a wide range of topics related to artificial languages in software engineering. SLE is an international research forum that brings together researchers and practitioners from both industry and academia to expand the frontiers of software language engineering. SLE's foremost mission is to encourage and organize communication between communities that have traditionally looked at software languages from different, more specialized, and yet complementary perspectives. SLE emphasizes the fundamental notion of languages, as opposed to any realization in specific technical spaces. In this context, the term "software language" comprises all sorts of artificial languages used in software development, including general-purpose programming languages, domain-specific languages, modeling and meta-modeling languages, data models, and ontologies. Software language engineering is the application of a systematic, disciplined, quantifiable approach to the development, use, and maintenance of these languages. The SLE conference is concerned with all phases of the lifecycle of software languages; these include the design, implementation, documentation, testing, deployment, evolution, recovery, and retirement of languages. Of special interest are tools, techniques, methods, and formalisms that support these activities. In particular, tools are often based on, or automatically generated from, a formal description of the language.
The need for a comprehensive survey-type exposition on formal languages and related mainstream areas of computer science has been evident for some years. In the early 1970s, when the book Formal Languages by the second mentioned editor appeared, it was still quite feasible to write a comprehensive book with that title and include also topics of current research interest. This would not be possible anymore. A standard-sized book on formal languages would either have to stay on a fairly low level or else be specialized and restricted to some narrow sector of the field. The setup becomes drastically different in a collection of contributions, where the best authorities in the world join forces, each of them concentrating on their own areas of specialization. The present three-volume Handbook constitutes such a unique collection. In these three volumes we present the current state of the art in formal language theory. We were most satisfied with the enthusiastic response given to our request for contributions by specialists representing various subfields. The need for a Handbook of Formal Languages was in many answers expressed in different ways: as an easily accessible historical reference, a general source of information, an overall course-aid, and a compact collection of material for self-study. We are convinced that the final result will satisfy such various needs.
The λ-calculus was invented by Church in the 1930s with the purpose of supplying a logical foundation for logic and mathematics [25]. Its use by Kleene as a coding for computable functions makes it the first programming language, in an abstract sense, exactly as the Turing machine can be considered the first computer machine [57]. The λ-calculus has quite a simple syntax (with just three formation rules for terms) and a simple operational semantics (with just one operation, substitution), and so it is a very basic setting for studying computation properties. The first contact between λ-calculus and real programming languages was in the years 1956-1960, when McCarthy developed the LISP programming language, inspired from λ-calculus, which is the first "functional" programming language, i.e., where functions are first-class citizens [66]. But the use of λ-calculus as an abstract paradigm for programming languages started later with the work of three important scientists: Strachey, Landin and Böhm.
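For reference, the three formation rules and the single substitution-based reduction rule mentioned above are, in standard notation (not the book's own typesetting):

```latex
\[
M, N \;::=\; x \;\mid\; (\lambda x.\, M) \;\mid\; (M\ N)
\qquad\qquad
(\lambda x.\, M)\ N \;\longrightarrow_{\beta}\; M[x := N]
\]
```

Here terms are either variables, abstractions, or applications, and the single β-rule replaces every free occurrence of x in M by the argument N.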
As today's most complex computing environment, the Internet confronts IT researchers, system designers, and application developers with completely new challenges and, as a fascinating new computing paradigm, agent technology has recently attracted broad interest and strong hopes for shaping the future information society. Relating both, the Internet and agents, opens up a whole new range of advanced applications in vibrant subfields of information technology such as middleware, mobile commerce, e-learning, collaborative working, and intelligent information services. Many modern advanced systems are likely to exploit Internet agents - and exploiting Internet agents mostly means dealing with coordination models and technologies of various sorts. This monograph-like anthology is the first systematic guide to models and enabling technologies for the coordination of intelligent agents on the Internet and respective applications.
This book is the third in a series of books collecting the best papers from the three main regional conferences on electronic system design languages: HDLCon in the United States, APCHDL in Asia-Pacific and FDL in Europe. APCHDL being biennial, this book presents a selection of papers from HDLCon'01 and FDL'01. HDLCon is the premier HDL event in the United States. It originated in 1999 from the merging of the International Verilog Conference and the Spring VHDL User's Forum. The scope of the conference expanded from specialized languages such as VHDL and Verilog to general-purpose languages such as C++ and Java. In 2001 it was held in February in Santa Clara, CA. Presentations from design engineers are technical in nature, reflecting real-life experiences in using HDLs. EDA vendors' presentations show what is available - and what is planned - for design tools that utilize HDLs, such as simulation and synthesis tools. The Forum on Design Languages (FDL) is the European forum to exchange experiences and learn of new trends in the application of languages, and the associated design methods and tools, to design complex electronic systems. FDL'01 was held in Lyon, France, around seven interrelated workshops: Hardware Description Languages, Analog and Mixed-Signal Specification, C/C++ HW/SW Specification and Design, Design Environments & Languages, Real-Time Specification for Embedded Systems, Architecture Modeling and Reuse, and System Specification & Design Languages.