Object relationships in modern software systems are becoming increasingly numerous and complex, and program errors due to violations of object relationships are difficult to detect. Programmers need new tools that allow them to explore objects in a large system more efficiently and to detect broken object relationships instantaneously. Such tools incorporate approaches used in such areas as data visualization, pattern matching and extraction, database querying, active databases, and rule-based programming. The query-based debugging approach developed by the author of this book is another powerful yet efficient tool to be added to the developer's tool chest. Advanced Debugging Methods presents practice and tools for debugging computer programs. This book proposes new powerful approaches that simplify the daunting task of debugging complex software systems. Although debugging has been addressed in numerous research papers, many of its methods have yet to be explored in a book-length format. This book helps to fill this gap by presenting an overview of existing debugging tools with motivating examples and case studies, as well as presenting new, state-of-the-art debugging methods. Advanced Debugging Methods will be of use to software developers looking for tools to be applied in cutting edge practice; system architects looking at the relationship between software design and debugging; tools and programming language researchers looking for new ideas in run-time tool implementation as well as detailed descriptions of advanced implementations; and university professors and graduate students who will use this book as supplementary reading for graduate courses in programming tools, language implementation, and advanced object-oriented systems. Advanced Debugging Methods is also a handy reference of currently existing debugging methodologies as well as a springboard for cutting-edge research to simplify the difficult task of debugging and to facilitate the development of more robust software systems.
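To give a rough sense of what a query-based debugger checks, consider the hedged Python sketch below. The Bank/Account classes, the invariant, and the use of gc.get_objects() to enumerate live objects are all assumptions invented for illustration; this is not the book's implementation, only a minimal rendering of the idea of evaluating a query over live objects to find broken relationships.

```python
import gc

class Bank:
    def __init__(self):
        self.accounts = []

class Account:
    def __init__(self, bank):
        self.bank = bank
        bank.accounts.append(self)

def broken_links():
    """Query: find (bank, account) pairs whose back-pointer is inconsistent."""
    violations = []
    for bank in (o for o in gc.get_objects() if isinstance(o, Bank)):
        for acct in bank.accounts:
            if acct.bank is not bank:          # expected relationship violated
                violations.append((bank, acct))
    return violations

b1, b2 = Bank(), Bank()
a = Account(b1)
a.bank = b2                                    # corrupt the relationship
print(len(broken_links()))                     # 1 violation reported
```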
This volume contains a selection of papers that focus on the state of the art in formal specification and verification of real-time computing systems. Preliminary versions of these papers were presented at a workshop on the foundations of real-time computing sponsored by the Office of Naval Research in October 1990 in Washington, D.C. A companion volume, titled Foundations of Real-Time Computing: Scheduling and Resource Management, complements this book by addressing many of the recently devised techniques and approaches for scheduling tasks and managing resources in real-time systems. Together, these two texts provide a comprehensive snapshot of current insights into the process of designing and building real-time computing systems on a scientific basis. The notion of real-time system has alternative interpretations, not all of which are intended usages in this collection of papers. Different communities of researchers variously use the term real-time to refer to either very fast computing, or immediate on-line data acquisition, or deadline-driven computing. This text is concerned with the formal specification and verification of computer software and systems whose correct performance is dependent on carefully orchestrated interactions with time, e.g., meeting deadlines and synchronizing with clocks. Such systems have been enabled for a rapidly increasing set of diverse end-uses by the unremitting advances in computing power per constant-dollar cost and per constant-unit-volume of space. End-use applications of real-time computers span a spectrum that includes transportation systems, robotics and manufacturing, aerospace and defense, industrial process control, and telecommunications.
In brief summary, the following results were presented in this work:
* A linear-time approach was developed to find register requirements for any specified CS schedule or filled MRT.
* An algorithm was developed for finding register requirements for any kernel whose dependence graph is acyclic and has no data reuse, on machines with depth-independent instruction templates.
* We presented an efficient method of estimating register requirements as a function of pipeline depth.
* We developed a technique for efficiently finding bounds on register requirements as a function of pipeline depth.
* We presented experimental data to verify these new techniques.
* We discussed some interesting design points for register file size on a number of different architectures.
Distributed and Parallel Systems: From Instruction Parallelism to Cluster Computing is the proceedings of the third Austrian-Hungarian Workshop on Distributed and Parallel Systems organized jointly by the Austrian Computer Society and the MTA SZTAKI Computer and Automation Research Institute. This book contains 18 full papers and 12 short papers from 14 countries around the world, including Japan, Korea and Brazil. The paper sessions cover a broad range of research topics in the area of parallel and distributed systems, including software development environments, performance evaluation, architectures, languages, algorithms, web and cluster computing. This volume will be useful to researchers and scholars interested in all areas related to parallel and distributed computing systems.
I felt deeply honored when Professor Sumit Ghosh asked me to write the foreword to his book with an extraordinary perspective. I have long admired him, first as a student leader at Stanford, where he initiated the first IEEE Computer Society's student chapter, and later as an esteemed and inspiring friend whose transdisciplinary research broadened and enhanced the horizons of practitioners of computer science and engineering, including my own. His ideas, which are derived from his profound vision, deep critical thinking, and personal intuition, reach from information technology to bioscience, as exhibited in this excellent book. To me, an ordinary engineer, it opens up a panoramic view of the Universe of Knowledge that keeps expanding and inspiring, like the good Indian proverb which says, "a good book informs you, an excellent book teaches you, and a great book changes you." I sincerely believe that Professor Ghosh's book will help us change and advance the methods of systems engineering and technology. Vision: Inspired vision sees ahead of others what will or may come to be, a vivid, imagined concept or anticipation. An inspired vision personifies what is good and what like-minded individuals hope for. Our vision is one of creating an Internet of minds, where minds are Web sites or knowledge centers, which create, store, and radiate knowledge through interaction with other minds connected by a universal shared network. This vision will not just hasten the death of distance, but will also incarcerate ignorance.
Bayesian Networks in R with Applications in Systems Biology is unique in that it introduces the reader to the essential concepts in Bayesian network modeling and inference in conjunction with examples in the open-source statistical environment R. The level of sophistication increases gradually across the chapters, with exercises and solutions that support hands-on experimentation with the theory and concepts. The applications focus on systems biology, with emphasis on modeling pathways and signaling mechanisms from high-throughput molecular data. Bayesian networks have proven to be especially useful abstractions in this regard. Their usefulness is exemplified by their ability to discover new associations in addition to validating known ones across the molecules of interest. It is also expected that the prevalence of publicly available high-throughput biological data sets will encourage the audience to investigate novel paradigms using the approaches presented in the book.
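To make the underlying idea concrete, here is a deliberately tiny sketch of Bayesian-network reasoning. The book itself works in R; this fragment is plain Python, the two-node "pathway → gene" network and all of its probabilities are invented for illustration, and the posterior is computed by simple enumeration rather than with any of the book's tools.

```python
# Hypothetical two-node network: Pathway (active?) -> Gene (expressed?).
# All probabilities below are made-up illustration values.
P_PATHWAY = {True: 0.3, False: 0.7}                    # prior P(pathway)
P_GENE = {True: {True: 0.9, False: 0.1},               # P(gene | pathway=True)
          False: {True: 0.2, False: 0.8}}              # P(gene | pathway=False)

def posterior_pathway(gene_expressed: bool) -> float:
    """P(pathway active | observed gene expression), by enumeration."""
    joint = {pw: P_PATHWAY[pw] * P_GENE[pw][gene_expressed]
             for pw in (True, False)}                   # P(pathway, gene)
    return joint[True] / sum(joint.values())            # normalize over pathway

print(round(posterior_pathway(True), 3))                # 0.659
```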
This book constitutes the refereed proceedings of the 2nd International Conference on Model and Data Engineering, MEDI 2012, held in Poitiers, France, in October 2012. The 12 revised full papers presented together with 5 short papers were carefully reviewed and selected from 35 submissions. The papers cover the topics of model driven engineering, ontology engineering, formal modeling, security, and data mining.
A Formal Approach to Hardware Design discusses designing computations to be realised by application specific hardware. It introduces a formal design approach based on a high-level design language called Synchronized Transitions. The models created using Synchronized Transitions enable the designer to perform different kinds of analysis and verification based on descriptions in a single language. It is, for example, possible to use exactly the same design description both for mechanically supported verification and synthesis. Synchronized Transitions is supported by a collection of public domain CAD tools. These tools can be used with the book in presenting a course on the subject. A Formal Approach to Hardware Design illustrates the benefits to be gained from adopting such techniques, but it does so without assuming prior knowledge of formal design methods. The book is thus not only an excellent reference, it is also suitable for use by students and practitioners.
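Synchronized Transitions describes a design as a collection of guarded, atomically executed transitions. The sketch below is not Synchronized Transitions syntax and is not drawn from the book; it is a hypothetical Python rendering of that guarded-transition style, using a toy request/acknowledge handshake, to show how such a model can be simulated until it becomes quiescent.

```python
import random

def step(state, transitions):
    """Fire one enabled transition atomically; return False when the model is quiescent."""
    enabled = [t for t in transitions if t[0](state)]
    if not enabled:
        return False
    guard, action = random.choice(enabled)   # nondeterministic choice among enabled guards
    action(state)
    return True

# Toy req/ack handshake transferring the values 0, 1, 2 (all names are illustrative).
state = {"req": False, "ack": False, "data": 0, "out": None, "produced": 0}
transitions = [
    # producer: offer the next value
    (lambda s: not s["req"] and not s["ack"] and s["produced"] < 3,
     lambda s: s.update(req=True, data=s["produced"], produced=s["produced"] + 1)),
    # consumer: latch the value and acknowledge
    (lambda s: s["req"] and not s["ack"],
     lambda s: s.update(out=s["data"], ack=True)),
    # producer: complete the handshake
    (lambda s: s["req"] and s["ack"],
     lambda s: s.update(req=False, ack=False)),
]

while step(state, transitions):
    pass
print(state["out"])   # 2: the last value transferred before the model quiesces
```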
This book constitutes the refereed proceedings of the 15th International Conference on Model Driven Engineering Languages and Systems, MODELS 2012, held in Innsbruck, Austria, in September/October 2012. The 50 papers presented in this volume were carefully reviewed and selected from a total of 181 submissions. They are organized in topical sections named: metamodels and domain specific modeling; models at runtime; model management; modeling methods and tools, consistency analysis, software product lines; foundations of modeling; static analysis techniques; model testing and simulation; model transformation; model matching, tracing and synchronization; modeling practices and experience; and model analysis.
Parsing technology traditionally consists of two branches, which correspond to the two main application areas of context-free grammars and their generalizations. Efficient deterministic parsing algorithms have been developed for parsing programming languages, and quite different algorithms are employed for analyzing natural language. The Functional Treatment of Parsing provides a functional framework within which the different traditional techniques are restated and unified. The resulting theory provides new recursive implementations of parsers for context-free grammars. The new implementations, called recursive ascent parsers, avoid explicit manipulation of parse stacks and parse matrices, and are in many ways superior to conventional implementations. They are applicable to grammars for programming languages as well as natural languages. The book has been written primarily for students and practitioners of parsing technology. With its emphasis on modern functional methods, however, the book will also be of benefit to scientists interested in functional programming. The Functional Treatment of Parsing is an excellent reference and can be used as a text for a course on the subject.
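As a rough illustration of the recursive ascent idea, the following sketch encodes the LR(0) states of a tiny left-recursive grammar (E -> E '+' num | num) as ordinary functions, so no explicit parse stack or parse matrix is manipulated. It is plain Python rather than a functional language, the grammar and state numbering are invented for the example, and it is not code from the book.

```python
def parse(tokens):
    """Evaluate 'n + n + ...' with a recursive ascent parser: each LR(0) state is a
    function, and a reduce returns (frames_still_to_pop, nonterminal, value, position)."""
    toks = list(tokens) + ["$"]              # '$' marks end of input

    def state2(i, n):                        # item: E -> n .
        return 0, "E", n, i                  # reduce E -> n: only this frame is popped

    def state4(i, left, n):                  # item: E -> E + n .
        return 2, "E", left + n, i           # reduce E -> E + n: pop 3 frames in total

    def state3(i, left):                     # item: E -> E + . n
        if toks[i].isdigit():
            k, nt, val, j = state4(i + 1, left, int(toks[i]))
            return k - 1, nt, val, j         # keep unwinding the reduce
        raise SyntaxError("number expected after '+'")

    def state1(i, left):                     # items: S' -> E . $ ,  E -> E . + n
        if toks[i] == "$":
            return 0, "$accept", left, i     # accept
        if toks[i] == "+":
            k, nt, val, j = state3(i + 1, left)
            if k > 0:
                return k - 1, nt, val, j     # reduce unwinds past this frame
        raise SyntaxError("'+' or end of input expected")

    def state0(i):                           # items: S' -> . E $,  E -> . E + n | . n
        if not toks[i].isdigit():
            raise SyntaxError("number expected")
        k, nt, val, j = state2(i + 1, int(toks[i]))
        while True:
            if k > 0:                        # a reduce is unwinding past this frame
                return k - 1, nt, val, j
            if nt == "E":                    # goto(state 0, E) = state 1
                k, nt, val, j = state1(j, val)
            elif nt == "$accept":
                return 0, nt, val, j
            else:
                raise SyntaxError("unexpected nonterminal")

    _, nt, value, _ = state0(0)
    assert nt == "$accept"
    return value

print(parse("1 + 2 + 3".split()))            # 6
```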
This book constitutes the thoroughly refereed post-conference proceedings of the 7th International Haifa Verification Conference, HVC 2011, held in Haifa, Israel, in December 2011. The 15 revised full papers presented together with 3 tool papers and 4 posters were carefully reviewed and selected from 43 submissions. The papers are organized in topical sections on synthesis, formal verification, software quality, testing and coverage, experience and tools, and posters from the student event.
It is recognized that formal design and verification methods are an important requirement for the attainment of high-quality system designs. The field has evolved enormously during the last few years, with the result that formal design and verification methods are nowadays supported by several tools, both commercial and academic. If different tools and users are to generate and read the same language, then it is necessary that the same semantics is assigned by all of them to every construct and element of the language. The current IEEE standard VHDL language reference manual (LRM) tries to define VHDL as well as possible in a descriptive way, explaining the semantics in English. But rigor and clarity are very hard to maintain in a semantics defined in this way, and that has already given rise to many misconceptions and contradictory interpretations. Formal Semantics for VHDL is the first book that puts forward a cohesive set of semantics for the VHDL language. The chapters describe several semantics, each based on a different underlying formalism: two of them use Petri nets as the target language, two use higher-order logic, two use functional concepts, and another uses the concept of evolving algebras. Formal Semantics for VHDL is essential reading for researchers in formal methods and can be used as a text for an advanced course on the subject.
This book constitutes the proceedings of the 16th Brazilian Symposium on Programming Languages, SBLP 2012, held in Natal, Brazil, in September 2012. The 10 full and 2 short papers were carefully reviewed and selected from 27 submissions. The papers cover various aspects of programming languages and software engineering.
This book constitutes the thoroughly refereed proceedings of the 10th International Symposium on Automated Technology for Verification and Analysis, ATVA 2012, held at Thiruvananthapuram, Kerala, India, in October 2012. The 25 regular papers, 3 invited papers and 4 tool papers presented were carefully selected from numerous submissions. Conference papers are organized in 9 technical sessions, covering the topics of automata theory, logics and proofs, model checking, software verification, synthesis, verification and parallelism, probabilistic verification, constraint solving and applications, and probabilistic systems.
The theory of constructive (recursive) models grew out of work by Froehlich, Shepherdson, Mal'tsev, Kuznetsov, Rabin, and Vaught in the 1950s. Within the framework of this theory, algorithmic properties of abstract models are investigated by constructing representations on the set of natural numbers and studying relations between algorithmic and structural properties of these models. This book is a very readable exposition of the modern theory of constructive models and describes methods and approaches developed by representatives of the Siberian school of algebra and logic and some other researchers (in particular, Nerode and his colleagues). The main themes are the existence of recursive models and applications to fields, algebras, and ordered sets (Ershov), the existence of decidable prime models (Goncharov, Harrington), the existence of decidable saturated models (Morley), the existence of decidable homogeneous models (Goncharov and Peretyat'kin), properties of the Ehrenfeucht theories (Millar, Ash, and Reed), the theory of algorithmic dimension and conditions of autostability (Goncharov, Ash, Shore, Khusainov, Ventsov, and others), and the theory of computable classes of models with various properties. Future perspectives of the theory of constructive models are also discussed. Most of the results in the book are presented in monograph form for the first time. The theory of constructive models serves as a basis for recursive mathematics. It is also useful in computer science, in particular in the study of programming languages, higher-level specification languages, abstract data types, and problems of synthesis and verification of programs. Therefore, the book will be useful not only for specialists in mathematical logic and the theory of algorithms but also for scientists interested in the mathematical foundations of computer science. The authors are eminent specialists in mathematical logic. They have established fundamental results on elementary theories, model theory, the theory of algorithms, field theory, group theory, applied logic, computable numberings, the theory of constructive models, and theoretical computer science.
A growing concern of mine has been the unrealistic expectations for new computer-related technologies introduced into all kinds of organizations. Unrealistic expectations lead to disappointment, and a schizophrenic approach to the introduction of new technologies. The UNIX and real-time UNIX operating system technologies are major examples of emerging technologies with great potential benefits but unrealistic expectations. Users want to use UNIX as a common operating system throughout large segments of their organizations. A common operating system would decrease software costs by helping to provide portability and interoperability between computer systems in today's multivendor environments. Users would be able to more easily purchase new equipment and technologies and cost-effectively reuse their applications. And they could more easily connect heterogeneous equipment in different departments without having to constantly write and rewrite interfaces. On the other hand, many users in various organizations do not understand the ramifications of general-purpose versus real-time UNIX. Users tend to think of "real-time" as a way to handle exotic heart-monitoring or robotics systems. Then these users use UNIX for transaction processing and office applications and complain about its performance, robustness, and reliability. Unfortunately, the users don't realize that real-time capabilities added to UNIX can provide better performance, robustness and reliability for these non-real-time applications. Many other vendors and users do realize this, however. There are indications even now that general-purpose UNIX will go away as a separate entity. It will be replaced by a real-time UNIX. General-purpose UNIX will exist only as a subset of real-time UNIX.
This book is a revised edition of the monograph which appeared under the same title in the series Research Notes in Theoretical Computer Science, Pitman, in 1986. In addition to a general effort to improve typography, English, and presentation, the main novelty of this second edition is the integration of some new material. Part of it is mine (mostly jointly with coauthors). Here is a brief guide to these additions. I have augmented the account of categorical combinatory logic with a description of the confluence properties of rewriting systems of categorical combinators (Hardin, Yokouchi), and of the newly developed calculi of explicit substitutions (Abadi, Cardelli, Curien, Hardin, Levy, and Rios), which are similar in spirit to the categorical combinatory logic, but are closer to the syntax of the λ-calculus (Section 1.2). The study of the full abstraction problem for PCF and extensions of it has been enriched with a new full abstraction result: the model of sequential algorithms is fully abstract with respect to an extension of PCF with a control operator (Cartwright, Felleisen, Curien). An order-extensional model of error-sensitive sequential algorithms is also fully abstract for a corresponding extension of PCF with a control operator and errors (Sections 2.6 and 4.1). I suggest that sequential algorithms lend themselves to a decomposition of the function spaces that leads to models of linear logic (Lamarche, Curien), and that connects sequentiality with games (Joyal, Blass, Abramsky) (Sections 2.1 and 2.6).
This book constitutes the thoroughly refereed proceedings of the 19th International Symposium on Static Analysis, SAS 2012, held in Deauville, France, in September 2012. The 25 revised full papers presented together with 4 invited talks were selected from 62 submissions. The papers address all aspects of static analysis, including abstract domains, abstract interpretation, abstract testing, bug detection, data flow analysis, model checking, new applications, program transformation, program verification, security analysis, theoretical frameworks, and type checking.
Domain theory is a rich interdisciplinary area at the intersection of logic, computer science, and mathematics. This volume contains selected papers presented at the International Symposium on Domain Theory which took place in Shanghai in October 1999. Topics of papers range from the encounters between topology and domain theory, sober spaces, Lawson topology, real number computability and continuous functionals to fuzzy modelling, logic programming, and pi-calculi. This book is a valuable reference for researchers and students interested in this rapidly developing area of theoretical computer science.
This book constitutes the thoroughly refereed post-workshop proceedings of the 9th International Workshop on Rewriting Logic and its Applications, WRLA 2012, held as a satellite event of ETAPS 2012, in Tallinn, Estonia, in March 2012. The 8 revised full papers presented together with 4 invited papers were carefully reviewed and selected from 12 initial submissions and 5 invited lectures. The papers address a great diversity of topics in the fields of rewriting logic such as: foundations and models, languages, logical and semantic framework, model-based software engineering, real-time and probabilistic extensions, verification techniques, and distributed systems.
Formal Languages and Applications provides a comprehensive study aid and self-tutorial for graduate students and researchers. The main results and techniques are presented in a readily accessible manner and accompanied by many references and directions for further research. This carefully edited monograph is intended to be the gateway to formal language theory and its applications, making it very useful as a review and reference source of information in formal language theory.
The two-volume set LNCS 7609 and 7610 constitutes the thoroughly refereed proceedings of the 5th International Symposium on Leveraging Applications of Formal Methods, Verification and Validation, held in Heraklion, Crete, Greece, in October 2012. The two volumes contain papers presented in the topical sections on adaptable and evolving software for eternal systems; approaches for mastering change; runtime verification: the application perspective; model-based testing and model inference; learning techniques for software verification and validation; LearnLib tutorial: from finite automata to register interface programs; RERS grey-box challenge 2012; Linux driver verification; bioscientific data processing and modeling; process and data integration in the networked healthcare; timing constraints: theory meets practice; formal methods for the development and certification of X-by-wire control systems; quantitative modelling and analysis; software aspects of robotic systems; process-oriented geoinformation systems and applications; and handling heterogeneity in formal development of HW and SW systems.
This book constitutes the thoroughly refereed post-conference proceedings of the 24th International Workshop on Languages and Compilers for Parallel Computing, LCPC 2011, held in Fort Collins, CO, USA, in September 2011. The 19 revised full papers presented and 19 poster papers were carefully reviewed and selected from 52 submissions. The scope of the workshop spans the theoretical and practical aspects of parallel and high-performance computing, and targets parallel platforms including concurrent, multithreaded, multicore, accelerator, multiprocessor, and cluster systems.
This book constitutes the thoroughly refereed proceedings of the 18th International Conference Euro-Par 2012, held on Rhodes Island, Greece, in August 2012. The 75 revised full papers presented were carefully reviewed and selected from 228 submissions. The papers are organized in topical sections on support tools and environments; performance prediction and evaluation; scheduling and load balancing; high-performance architectures and compilers; parallel and distributed data management; grid, cluster and cloud computing; peer-to-peer computing; distributed systems and algorithms; parallel and distributed programming; parallel numerical algorithms; multicore and manycore programming; theory and algorithms for parallel computation; high-performance networks and communication; mobile and ubiquitous computing; high-performance and scientific applications; and GPU and accelerator computing.
The European Summer School in Logic, Language and Information (ESSLLI) is organized every year by the Association for Logic, Language and Information (FoLLI) at different sites around Europe. The main focus of ESSLLI is on the interface between linguistics, logic and computation. ESSLLI offers foundational, introductory and advanced courses, as well as workshops, covering a wide variety of topics within the three areas of interest: Language and Computation, Language and Logic, and Logic and Computation. Over two weeks, around 50 courses and 10 workshops are offered to the attendants, each of 1.5 hours per day during a five-day week, with up to seven parallel sessions. ESSLLI also includes a student session (papers and posters by students only, 1.5 hours per day during the two weeks) and four evening lectures by senior scientists in the covered areas. The 15 revised full papers presented were carefully reviewed and selected. The papers are organized in topical sections on language and computation; logic and computation; and logic and language.