This book constitutes the proceedings of the 22nd International Conference on Compiler Construction, CC 2013, held as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2013, which took place in Rome, Italy, in March 2013. The 13 papers presented in this book were carefully reviewed and selected from 53 submissions. They have been organized into five topical sections on register allocation, pointer analysis, data and information flow, machine learning, and refactoring.
The LNCS journal Transactions on Aspect-Oriented Software Development is devoted to all facets of aspect-oriented software development (AOSD) techniques in the context of all phases of the software life cycle, from requirements and design to implementation, maintenance and evolution. The focus of the journal is on approaches for systematic identification, modularization, representation and composition of crosscutting concerns, i.e., the aspects, and on the evaluation of such approaches and their impact on improving quality attributes of software systems. This volume, the 10th in the Transactions on Aspect-Oriented Software Development series, contains revised, extended versions of the top five papers presented at AOSD 2012. The topics covered include debugging, analysis of software product lines, distributed software architectures, and empirical study of language support for software evolution.
Mountaineers use pitons to protect themselves from falls. The lead climber wears a harness to which a rope is tied. As the climber ascends, the rope is paid out by a partner on the ground. As described thus far, the climber receives no protection from the rope or the partner. However, the climber generally carries several spike-like pitons and stops when possible to drive one into a small crack or crevice in the rock face. After climbing just above the piton, the climber clips the rope to the piton, using slings and carabiners. A subsequent fall would result in the climber hanging from the piton if the piton stays in the rock, the slings and carabiners do not fail, the rope does not break, the partner is holding the rope taut and secure, and the climber had not climbed too high above the piton before falling. The climber's safety clearly depends on all of the components of the system. But the piton is distinguished because it connects the natural to the artificial. In 1987 I designed an assembly-level language for Warren Hunt's FM8501 verified microprocessor. I wanted the language to be conveniently used as the object code produced by verified compilers. Thus, I envisioned the language as the first software link in a trusted chain from verified hardware to verified applications programs. Thinking of the hardware as the "rock" I named the language "Piton."
JR is an extension of the Java programming language with additional concurrency mechanisms based on those in the SR (Synchronizing Resources) programming language. The JR implementation executes on UNIX-based systems (Linux, Mac OS X, and Solaris) and Windows-based systems. It is available free from the JR webpage. This book describes the JR programming language and illustrates how it can be used to write concurrent programs for a variety of applications. This text presents numerous small and large example programs. The source code for all programming examples and the given parts of all programming exercises are available on the JR webpage. Dr. Ronald A. Olsson and Dr. Aaron W. Keen, the authors of this text, are the designers and implementors of JR.
This book constitutes the proceedings of the IFIP Working Conference PROCOMET'98, held 8-12 June 1998 at Shelter Island, N.Y. The conference was organized by the two IFIP TC 2 Working Groups 2.2 (Formal Description of Programming Concepts) and 2.3 (Programming Methodology). WG2.2 and WG2.3 have been organizing these conferences every four years for over twenty years. The aim of such Working Conferences organized by IFIP Working Groups is to bring together leading scientists in a given area of computer science. Participation is by invitation only. As a result, these conferences distinguish themselves from other meetings by extensive and competent technical discussions. PROCOMET stands for Programming Concepts and Methods, indicating that the area of discussion for the conference is the formal description of programming concepts and methods, their tool support, and their applications. At PROCOMET working conferences, papers are presented from this whole area, reflecting the interests of the individuals in WG2.2 and WG2.3.
The development of a methodology for using logic databases is essential if new users are to be able to use these systems effectively to solve their problems, and this remains a largely unrealized goal. A workshop was organized in conjunction with the ILPS '93 Conference in Vancouver in October 1993 to provide a forum for users and implementors of deductive systems to share their experience. The emphasis was on the use of deductive systems. In addition to paper presentations, a number of systems were demonstrated. The papers of this book were drawn largely from the papers presented at the workshop, which have been extended and revised for inclusion here; some papers describing interesting applications that were not discussed at the workshop are also included. The applications described here should be seen as a starting point: a number of promising application domains are identified, and several interesting application packages are described, which provide the inspiration for further development. Declarative rule-based database systems hold a lot of promise in a wide range of application domains, and we need a continued stream of application development to better understand this potential and how to use it effectively. This book contains the broadest collection to date of papers describing implemented, significant applications of logic databases, and should be of interest to developers of database systems as well as potential database users in such areas as scientific data management and complex decision support.
To construct a compiler for a modern higher-level programming language, one needs to structure the translation to a machine-like intermediate language in a way that reflects the semantics of the language. Little is said about such structuring in compiler texts that are intended to cover a wide variety of programming languages. More is said in the literature on semantics-directed compiler construction [1], but here too the viewpoint is very general (though limited to languages with a finite number of syntactic types). On the other hand, there is a considerable body of work using the continuation-passing transformation to structure compilers for the specific case of call-by-value languages such as SCHEME and ML [2, 3]. In this paper, we will describe a method of structuring the translation of ALGOL-like languages that is based on the functor-category semantics developed by Reynolds [4] and Oles [5, 6]. An alternative approach using category theory to structure compilers is the early work of F. L. Morris [7], which anticipates our treatment of boolean expressions but does not deal with procedures. Types and Syntax: an ALGOL-like language is a typed lambda calculus with an unusual repertoire of primitive types. Throughout most of this paper we assume that the primitive types are comm(and), int(eger)exp(ression), int(eger)acc(eptor) and int(eger)var(iable), and that the set of types is the least set containing these primitive types and closed under the binary operation ->.
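The type structure this excerpt describes, a handful of primitive phrase types with the full set of types generated by closing them under a single arrow operation, can be made concrete in a few lines. The following sketch is not from the paper; it is an illustrative Python rendering with invented names.

```python
from dataclasses import dataclass
from typing import Union

# Primitive phrase types named in the excerpt: command, integer expression,
# integer acceptor, and integer variable.
@dataclass(frozen=True)
class Prim:
    name: str  # "comm", "intexp", "intacc", or "intvar"

# Arrow types close the set of types under the binary operation ->.
@dataclass(frozen=True)
class Arrow:
    arg: "Type"
    res: "Type"

Type = Union[Prim, Arrow]

def show(t: Type) -> str:
    """Render a type, e.g. '(intexp -> comm)'."""
    if isinstance(t, Prim):
        return t.name
    return f"({show(t.arg)} -> {show(t.res)})"

# Example: the type of a procedure mapping an integer expression to a command.
print(show(Arrow(Prim("intexp"), Prim("comm"))))  # (intexp -> comm)
```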
This book had its genesis in the following piece of computer mail: From allegra joan-b Tue Dec 18 09:15:54 1984 To: sola hjb Subject: lispm Hank, I've been talking with Mark Plotnik and Bill Gale about asking you to conduct a basic course on using the lisp machine. Mark, for instance, would really like to cover basics like the flavor system, etc., so he could start doing his own programming without a lot of trial and error, and Bill and I would be interested in this, too. I'm quite sure that Mark Jones, Bruce, Eric and Van would also be really interested. Would you like to do it? Bill has let me know that if you'd care to set something up, he's free to meet with us anytime this week or next (although I'll only be here on Wed. next week) so we can come up with a plan. What do you think? Joan.
This book constitutes the refereed proceedings of the 4th International Symposium on Unifying Theories of Programming, UTP 2012, held in Paris, France, in August 2012, co-located with the 18th International Symposium on Formal Methods, FM 2012. The 8 revised full papers presented together with 2 invited talks and one invited lecture were carefully reviewed and selected from 13 submissions.
This book constitutes the thoroughly refereed post-proceedings of the 5th International Conference on Software Language Engineering, SLE 2012, held in Dresden, Germany, in September 2012. The 17 papers presented together with 2 tool demonstration papers were carefully reviewed and selected from 62 submissions. SLE's foremost mission is to encourage and organize communication between communities that have traditionally looked at software languages from different, more specialized, and yet complementary perspectives. SLE emphasizes the fundamental notion of languages as opposed to any realization in specific technical spaces.
Automatic code generation is an essential cornerstone of model-driven approaches to software development. Currently, many techniques are available that support the specification and implementation of code generators, such as engines based on templates or rule-based transformations. All these techniques have in common that code generators are either directly programmed or described by means of textual specifications. This monograph presents Genesys, a general approach which advocates the graphical development of code generators for arbitrary source and target languages, on the basis of models and services. In particular, it is designed to support incremental language development on arbitrary metalevels. The use of models allows building code generators in a truly platform-independent and domain-specific way. Furthermore, models are amenable to formal verification methods such as model checking, which increase the reliability and robustness of the code generators. Services enable the reuse and integration of existing code generation frameworks and tools regardless of their complexity, and at the same time manifest as easy-to-use building blocks which facilitate agile development through quick interchangeability. Both models and services are reusable and thus form a growing repository for the fast creation and evolution of code generators.
This book constitutes the proceedings of the 17th International Conference on Fundamental Approaches to Software Engineering, FASE 2014, held as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2014, which took place in Grenoble, France, in April 2014. The 28 papers included in this volume, together with one invited talk, were carefully reviewed and selected from 125 submissions. They have been organized in topical sections on: modeling and model transformation; time and performance; static analysis; scenario-based specification; software verification; analysis and repair; verification and validation; graph transformation; and debugging and testing.
Learn how to write R code with fewer bugs. The problem with programming is that you are always one typo away from writing something silly. Likewise with data analysis, a small mistake in your model can lead to a big mistake in your results. Combining the two disciplines means that it is all too easy for a missed minus sign to generate a false prediction that you don't spot until it's too late. Testing is the only way to be sure that your code, and your results, are correct. Testing R Code teaches you how to perform development-time testing using the testthat package, allowing you to ensure that your code works as intended. The book also teaches run-time testing using the assertive package, enabling your users to run your code correctly. After beginning with an introduction to testing in R, the book explores more advanced cases such as integrating tests into R packages; testing code that accesses databases; testing C++ code with Rcpp; and testing graphics. Each topic is explained with real-world examples and has accompanying exercises for readers to practise their skills; only a small amount of experience with R is needed to get started!
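The split this blurb describes, checks run by the developer before shipping versus checks run every time users call the code, is language-independent. The book's own examples use R's testthat and assertive packages; the sketch below merely illustrates the same split in Python with an invented moving_average function, and is not taken from the book.

```python
# Run-time check: guards users of the function on every call.
def moving_average(values, window):
    if window <= 0 or window > len(values):
        raise ValueError("window must be between 1 and len(values)")
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

# Development-time check: run by the developer (e.g. via `python -m pytest`)
# during development, never by end users.
def test_moving_average():
    assert moving_average([1, 2, 3, 4], 2) == [1.5, 2.5, 3.5]
```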
This book constitutes the proceedings of the 20th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2014, which took place in Grenoble, France, in April 2014, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2014. The 42 papers included in this volume (26 research papers, 3 case study papers, 6 regular tool papers, and 7 tool demonstration papers) were carefully reviewed and selected from 161 submissions. In addition, the book contains one invited contribution. The papers are organized in topical sections named: decision procedures and their application in analysis; complexity and termination analysis; modeling and model checking discrete systems; timed and hybrid systems; monitoring, fault detection and identification; competition on software verification; specifying and checking linear time properties; synthesis and learning; quantum and probabilistic systems; as well as tool demonstrations and case studies.
This book describes a new class of computing devices which are becoming omnipresent in everyday life. They make information access and processing easily available for everyone from anywhere at any time. Mobility, wireless connectivity, diversity, and ease-of-use are the magic keywords of Pervasive and Ubiquitous Computing. The book covers these front-end devices as well as their operating systems and the back-end infrastructure which integrates these pervasive components into a seamless IT world. A strong emphasis is placed on the underlying technologies and standards applied when building up pervasive solutions. These fundamental topics include commonly used terms such as XML, WAP, UMTS, GPRS, Bluetooth, Jini, transcoding, and cryptography, to mention just a few. Voice, Web Application Servers, Portals, Web Services, and Synchronization and Device Management are new in the second edition.
This book introduces the statistical software R to the image processing community in an intuitive and practical manner. R brings interesting statistical and graphical tools which are important and necessary for image processing techniques. Furthermore, it has been proved in the literature that R is among the most reliable, accurate and portable statistical software available. Both the theory and practice of R code concepts and techniques are presented and explained, and the reader is encouraged to try their own implementation to develop faster, optimized programs. Those who are new to the field of image processing and to R software will find this work a useful introduction. By reading the book alongside an active R session, the reader will experience an exciting journey of learning and programming.
The LNCS journal Transactions on Aspect-Oriented Software Development is devoted to all facets of aspect-oriented software development (AOSD) techniques in the context of all phases of the software life cycle, from requirements and design to implementation, maintenance and evolution. The focus of the journal is on approaches for systematic identification, modularization, representation and composition of crosscutting concerns, i.e., the aspects, and on the evaluation of such approaches and their impact on improving quality attributes of software systems. This volume, the 11th in the Transactions on Aspect-Oriented Software Development series, consists of two parts. The first part focuses on runtime verification and analysis, highlighting runtime verification as a "killer" application of aspect-orientation. The second part contains revised and extended versions of the five best papers submitted to Modularity: aosd 2013, presenting current research related to modularity and covering topics such as formal methods and type systems, static analysis approaches for software architectures, model-driven engineering and model composition, aspect-oriented programming, event-driven programming and reactive programming.
Modula-2 is a simple yet powerful programming language that is suitable for a wide variety of applications. It is based on Pascal, a successful programming language that was introduced in 1970 by Niklaus Wirth. During the 1970s Pascal became the most widely taught programming language and it gained acceptance in science and industry. In 1980 Dr. Wirth released the Modula-2 programming language. Modula-2 is an evolution of Pascal. It improves on the successes of Pascal while adding the MODULE, a tool for expressing the relations between the major parts of programs. In addition, Modula-2 contains low-level features for systems programming and coroutines for concurrent programming. Programming languages are important because they are used to express ideas. Some programming languages are so limited that certain ideas can't be easily expressed. For example, languages that lack floating-point arithmetic are inappropriate for scientific computations. Languages such as Basic and Fortran that lack recursion are unsuitable for text processing or systems programming. Sometimes a programming language is usable for a certain application but it is far from ideal. A good example is the difficulty of writing large programs in pure Pascal. Pascal is a poor language for large jobs because it lacks facilities for partitioning a program into separate pieces that can be developed independently.
Constraint Logic Programming (CLP), an area of extreme research interest in recent years, extends the semantics of Prolog in such a way that the combinatorial explosion, a characteristic of most problems in the field of Artificial Intelligence, can be tackled efficiently. By employing solvers dedicated to each domain instead of the unification algorithm, CLP drastically reduces the search space of the problem, which leads to increased efficiency in the execution of logic programs. CLP offers the possibility of solving complex combinatorial problems in an efficient way, and at the same time maintains the advantages offered by the declarativeness of logic programming. The aim of this book is to present parallel and constraint logic programming, offering a basic understanding of the two fields to the reader new to the area. The first part of the book gives an introduction to the fundamental aspects of conventional logic programming which is necessary for understanding the parts that follow. The second part includes an introduction to parallel logic programming, architectures and implementations proposed in the area. Finally, the third part presents the principles of constraint logic programming. The last two parts also include descriptions of the supporting facilities for the two paradigms in two popular systems, ECLiPSe and SICStus. These platforms have been selected mainly because they offer both parallel and constraint features. Annotated and explained examples are also included in the relevant parts, offering a valuable guide and a first practical experience to the reader. Finally, applications of the covered paradigms are presented. The authors felt that a book of this kind should provide some theoretical background necessary for the understanding of the covered logic programming paradigms, and a quick start for the reader interested in writing parallel and constraint logic programs. However, it is outside the scope of this book to provide a deep theoretical background of the two areas. In that sense, this book is addressed to a public interested in obtaining a knowledge of the domain without spending the time and effort to understand the extensive theoretical work done in the field, namely postgraduate and advanced undergraduate students in the area of logic programming. This book fills a gap in the current bibliography, since there is no comprehensive book of this level that covers the areas of conventional, parallel, and constraint logic programming. Parallel and Constraint Logic Programming: An Introduction to Logic, Parallelism and Constraints is appropriate for an advanced level course on Logic Programming or Constraints, and as a reference for practitioners and researchers in industry.
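The efficiency argument in this blurb, pruning with dedicated constraint reasoning instead of relying on blind generate-and-test over the search space, can be illustrated outside Prolog. The sketch below is a toy finite-domain example in Python; it is not from the book (whose examples use ECLiPSe and SICStus), and the problem and function names are invented for illustration.

```python
from itertools import product

# Toy finite-domain problem: find digits x, y in 0..9 with x + y == 9 and y == 2 * x.
domains = {"x": set(range(10)), "y": set(range(10))}

def generate_and_test():
    """Blind search: enumerate the full 10 x 10 cross product and test every pair."""
    return [(x, y) for x, y in product(domains["x"], domains["y"])
            if x + y == 9 and y == 2 * x]

def prune_then_search():
    """Constraint-style pruning: shrink each domain first, then enumerate."""
    # y == 2 * x rules out every x whose double is not a digit, and every odd y.
    xs = {x for x in domains["x"] if 2 * x in domains["y"]}
    ys = {y for y in domains["y"] if y % 2 == 0 and y // 2 in xs}
    # The remaining 5 x 5 grid is a quarter of the original search space.
    return [(x, y) for x, y in product(xs, ys) if x + y == 9 and y == 2 * x]

print(generate_and_test())   # [(3, 6)]
print(prune_then_search())   # [(3, 6)], found after visiting far fewer candidates
```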
From the book's table of contents (excerpt): From the Old to the New; Acknowledgments; Verilog - A Tutorial Introduction; Getting Started; A Structural Description; Simulating the binaryToESeg Driver; Creating Ports For the Module; Creating a Testbench For a Module; Behavioral Modeling of Combinational Circuits; Procedural Models; Rules for Synthesizing Combinational Circuits; Procedural Modeling of Clocked Sequential Circuits; Modeling Finite State Machines; Rules for Synthesizing Sequential Systems; Non-Blocking Assignment ...
Formal Methods for Open Object-Based Distributed Systems presents the leading edge in several related fields, specifically object-oriented programming, open distributed systems and formal methods for object-oriented systems. With increased support within industry regarding these areas, this book captures the most up-to-date information on the subject. Many topics are discussed, including the following important areas: object-oriented design and programming; formal specification of distributed systems; open distributed platforms; types, interfaces and behaviour; formalisation of object-oriented methods. This volume comprises the proceedings of the International Workshop on Formal Methods for Open Object-based Distributed Systems (FMOODS), sponsored by the International Federation for Information Processing (IFIP), which was held in Florence, Italy, in February 1999. Formal Methods for Open Object-Based Distributed Systems is suitable as a secondary text for graduate-level courses in computer science and telecommunications, and as a reference for researchers and practitioners in industry, commerce and government.
LOTOS (Language Of Temporal Ordering Specification) became an international standard in 1989, although application of preliminary versions of the language to communication services and protocols of the ISO/OSI family dates back to 1984. This history of the use of LOTOS made it apparent that more advantages than the pure production of standard reference documents were to be expected from the use of such formal description techniques. LOTOSphere: Software Development with LOTOS describes in depth a five-year project that moved LOTOS out of the ISO tower into software engineering practice. LOTOS became a vehicle for efficient, yet formally based, industrial software specification, design, verification, implementation and testing. LOTOSphere: Software Development with LOTOS is divided into six parts. The first introduces the reader to LOTOS and the project LOTOSphere. The remaining five each treat an important part of the software development life cycle using LOTOS. This is the first book to give a comprehensive treatment of the use of these formal description techniques in a software engineering environment. It will thus be a valuable reference for researchers and software developers and can also be used as a text for an advanced course on the subject.
This book constitutes the refereed proceedings of the 11th International Symposium on Automated Technology for Verification and Analysis, ATVA 2013, held in Hanoi, Vietnam, in October 2013. The 27 regular papers, 3 short papers and 12 tool papers presented together with 3 invited talks were carefully selected from 73 submissions. The papers are organized in topical sections on: analysis and verification of hardware circuits, systems-on-chip and embedded systems; analysis of real-time, hybrid, priced/weighted and probabilistic systems; deductive, algorithmic, compositional, and abstraction/refinement techniques for analysis and verification; analytical techniques for safety, security, and dependability; testing and runtime analysis based on verification technology; analysis and verification of parallel and concurrent hardware/software systems; verification in industrial practice; and applications and case studies.
Parallel processing is seen today as the means to improve the power of computing facilities by breaking the Von Neumann bottleneck of conventional sequential computer architectures. By defining appropriate parallel computation models, definite advantages can be obtained. Parallel processing is the center of research in Europe in the field of Information Processing Systems, so the CEC has funded the ESPRIT Supernode project to develop a low-cost, high-performance, multiprocessor machine. The result of this project is a modular, reconfigurable architecture based on INMOS transputers: T.Node. This machine can be considered a research, industrial and commercial success. The CEC has decided to continue to encourage manufacturers as well as research and end-users of transputers by funding other projects in this field. This book presents course papers of the Eurocourse given at the Joint Research Centre in ISPRA (Italy) from the 4th to the 8th of November 1991. First we present an overview of various trends in the design of parallel architectures, especially of the T.Node with its software development environments, new distributed system aspects and also new hardware extensions based on the INMOS T9000 processor. In the second part, we review some real case applications in the fields of image synthesis, image processing, signal processing, terrain modeling, particle physics simulation, and also enhanced parallel and distributed numerical methods on T.Node.