Now available in paperback. With Beginning C: From Novice to Professional, Fourth Edition, you'll come to understand the fundamentals of the C language and learn how to program. All you need is this book and any one of the widely available free or commercial C or C++ compilers, and you'll soon be writing real C programs. You'll learn C from first principles, using step-by-step working examples that you'll create and execute yourself. This book will increase your programming expertise by guiding you through the development of fully working C applications that use what you've learned in a practical context. You'll also be able to strike out on your own by trying the exercises included at the end of each chapter. Pick up a copy of this book by renowned author Ivor Horton because it: is the only beginning-level book to cover the latest ANSI standard in C; is approachable and aimed squarely at people new to C; emphasizes writing code after the first chapter; and includes substantial examples relevant to intermediate users.
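As a taste of the step-by-step working examples described above, here is a minimal first C program of our own devising (not an example from the book); any of the free or commercial compilers mentioned will build it:

```c
#include <stdio.h>

/* A complete first program: prints a short seven-times table.
   Build with, e.g., gcc first.c -o first, then run ./first */
int main(void)
{
    for (int i = 1; i <= 5; ++i)
        printf("%d x 7 = %d\n", i, i * 7);
    return 0;
}
```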
The topics covered in this text are those usually covered in a full year's course in finite mathematics or mathematics for liberal arts students. They correspond very closely to the topics I have taught at Western New England College to freshmen business and liberal arts students. They include set theory, logic, matrices and determinants, functions and graphing, basic differential and integral calculus, probability and statistics, and trigonometry. Because this is an introductory text, none of these topics is dealt with in great depth. The idea is to introduce the student to some of the basic concepts in mathematics along with some of their applications. I believe that this text is self-contained and can be used successfully by any college student who has completed at least two years of high school mathematics including one year of algebra. In addition, no previous knowledge of any programming language is necessary. The distinguishing feature of this text is that the student is given the opportunity to learn the mathematical concepts via A Programming Language (APL). APL was developed by Kenneth E. Iverson while he was at Harvard University and was presented in a book by Dr. Iverson entitled A Programming Language in 1962. He invented APL for educational purposes; that is, APL was designed to be a consistent, unambiguous, and powerful notation for communicating mathematical ideas. In 1966, APL became available on a time-sharing system at IBM.
When I compare the books on expert systems in my library with the production expert systems I know of, I note that there are few good books on building expert systems in Prolog. Of course, the set of actual production systems is a little small for a valid statistical sample, at least at the time and place of this writing - here in Germany, and in the first days of 1989. But there are at least some systems I have seen running in real life commercial and industrial environments, and not only at trade shows. I can observe the most impressive one in my immediate neighborhood. It is installed in the Telephone Shop of the German Federal PTT near the Munich National Theater, and helps configure telephone systems and small PBXs for mostly private customers. It has a neat, graphical interface, and constructs and prices an individual telephone installation interactively before the very eyes of the customer. The hidden features of the system are even more impressive. It is part of an expert system network with a distributed knowledge base that will grow to about 150 installations in every Telephone Shop throughout Germany. Each of them can be updated individually overnight via Teletex to present special offers or to adapt the selection process to the hardware supplies currently available at the local warehouses.
" .. .1 always worked with programming languages because it seemed to me that until you could understand those, you really couldn't understand computers. Understanding them doesn't really mean only being able to use them. A lot of people can use them without understanding them." Christopher Strachey The development of programming languages is one of the finest intellectual achievements of the new discipline called Computer Science. And yet, there is no other subject that I know of, that has such emotionalism and mystique associated with it. Thus, my attempt to write about this highly charged subject is taken with a good deal of in my role as professor I have felt the need for a caution. Nevertheless, modern treatment of this subject. Traditional books on programming languages are like abbreviated language manuals, but this book takes a fundamentally different point of view. I believe that the best possible way to study and understand today's programming languages is by focusing on a few essential concepts. These concepts form the outline for this book and include such topics as variables, expressions, statements, typing, scope, procedures, data types, exception handling and concurrency. By understanding what these concepts are and how they are realized in different programming languages, one arrives at a level of comprehension far greater than one gets by writing some programs in a xii Preface few languages. Moreover, knowledge of these concepts provides a framework for understanding future language designs.
This text is an introduction to programming in general, and a manual for programming with the language Modula-2 in particular. It is oriented primarily towards people who have already acquired some basic knowledge of programming and would like to deepen their understanding in a more structured way. Nevertheless, an introductory chapter is included for the benefit of the beginner, displaying in a concise form some of the fundamental concepts of computers and their programming. The text is therefore also suitable as a self-contained tutorial. The notation used is Modula-2, which lends itself well to a structured approach and leads the student to a working style that has generally become known under the title of structured programming. As a manual for programming in Modula-2, the text covers practically all facilities of that language. Part 1 covers the basic notions of the variable, expression, assignment, conditional and repetitive statement, and array data structure. Together with Part 2, which introduces the important concept of the procedure or subroutine, it contains essentially the material commonly discussed in introductory programming courses. Part 3 concerns data types and structures and constitutes the essence of an advanced course on programming. Part 4 introduces the notion of the module, a concept that is fundamental to the design of larger programmed systems and to programming as team work. The most commonly used utility programs for input and output are presented as examples of modules.
At least four research fields determine the theoretical background of specification and deduction in computer science: recursion theory, automated theorem proving, abstract data types and term rewriting systems. As these areas approach each other more and more, the strong distinctions between functional and relational views, deductive and denotational approaches as well as between specification and programming are relieved in favour of their integration. The book will not expose the lines of this development; conversely, it starts out from the nucleus of Horn clause logic and brings forth both known and unknown results, most of which affect more than one of the fields mentioned above. Chapter 1 touches on historical issues of specification and prototyping and delimits the topics handled in this book from others which are at the core of related work. Chapter 2 provides the fundamental notions and notations needed for the presentation and interpretation of many-sorted Horn clause theories with equality. Chapter 3 supplies a number of sample Horn clause specifications ranging from arithmetic through string manipulation to higher data structures and interpreters of programming languages. Some of these examples serve as a reference to illustrate definitions and results, others may throw a light on the strong link between specifications and programs, which are executed by applying deduction rules. Thus we have included examples of how to use program transformation methods in specification design.
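To make the flavor of such specifications concrete, here is a small illustrative fragment in the same spirit (our sketch, not one of the book's Chapter 3 examples), rendered in LaTeX:

```latex
\documentclass{article}
\begin{document}
% A many-sorted Horn clause specification with equality: addition and
% ordering on naturals generated by 0 and the successor s.
\[
  x + 0 \equiv x, \qquad
  x + s(y) \equiv s(x + y), \qquad
  x \le y \;\Leftarrow\; x + z \equiv y .
\]
\end{document}
```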
You might well wonder why TeX in Practice is a part of the Monographs in Visualization series. However, if you really think about typesetting, especially fine typesetting, you soon realize that in large part it is a visual art as well as a science. TeX's algorithms produce in almost all cases aesthetic results of the highest quality. On the other hand, occasionally one may want to insert some additional space before a subscript or superscript, or one may want to adjust the vertical spacing in a fraction. Fortunately Donald Knuth, the author of TeX, allows one to program such corrections easily where needed. The four volumes of Stephan von Bechtolsheim's long awaited TeX in Practice present a comprehensive view of TeX. His thorough discussion of each aspect of TeX is liberally laced with cogent illustrative examples. Many of these examples represent complete, ready to use macros that enhance the capabilities of TeX. These examples are of particular interest to both the typesetter and the TeX programmer. The typesetter can often solve an immediate problem by either using one of the examples directly or by making minor changes to adapt it to the problem at hand. The TeX programmer can use the examples, along with Stephan's detailed discussion, to increase both the depth and breadth of his or her knowledge of TeX. The value of the text is further enhanced by Stephan's concerted effort to explain the reasoning behind each topic or example.
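For instance, the two corrections mentioned above take only a few tokens to program; the following is a minimal LaTeX sketch of ours (the book itself develops plain-TeX macros in far more depth):

```latex
\documentclass{article}
\begin{document}
% Extra thin space before a subscript: compare the tight and spaced forms.
$T_{\max}$ \quad vs. \quad $T\,{}_{\max}$

% Opening up a fraction vertically: struts give the numerator and
% denominator a little more room around the rule.
$\frac{n+1}{n-1}$ \quad vs. \quad $\frac{\strut n+1}{\strut n-1}$
\end{document}
```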
The title of this book contains the words ALGORITHMIC LANGUAGE, in the singular. This is meant to convey the idea that it deals not so much with the diversity of programming languages, but rather with their commonalities. The task of formal program development proved to be the ideal frame for demonstrating this unity. It allows classifying concepts and distinguishing fundamental notions from notational features; and it leads immediately to a systematic disposition. This approach is supported by didactic, practical, and theoretical considerations. The clarity of the structure of a programming language designed according to the principles of program transformation is remarkable. Of course there are various notations for such a language. The notation used in this book is mainly oriented towards ALGOL 68, but is also strongly influenced by PASCAL - it could equally well have been the other way round. In the appendices there are occasional references to the styles used in ALGOL, PASCAL, LISP, and elsewhere.
Although you only have one volume in front of you, writing four volumes and 1600 pages on a single subject needs some form of justification. And then on the other hand, why write even more?! Can't, at least, the preface of something that long be short?! Very well, so let's keep it short. It is my sincere hope that the series "TeX in Practice" will be useful for your own TeX work. But please, before you get started, read the "Notes on 'TeX in Practice'," because they instruct you how to use this series. You will find these notes on pages xxvii-xxxvi. The fourth and last volume deals with two different subject areas. First of all, there are the so-called output routines, which are responsible for putting together the pages as generated by TeX. You will be amazed at how many different things can be done with TeX's output routines. The second subject area we deal with in this volume is tables. About a hundred different tables you can choose from should provide you with a starting point in the selection of tables.
This Festschrift volume, published in honor of Carolyn Talcott on the occasion of her 70th birthday, contains a collection of papers presented at a symposium held in Menlo Park, California, USA, in November 2011. Carolyn Talcott is a leading researcher and mentor of international renown among computer scientists. She has made key contributions to a number of areas of computer science including: semantics and verification of programming languages; foundations of actor-based systems; middleware, meta-architectures, and systems; Maude and rewriting logic; and computational biology. The 21 papers presented are organized in topical sections named: Essays on Carolyn Talcott; actors and programming languages; cyberphysical systems; middleware and meta-architectures; formal methods and reasoning tools; and computational biology.
GPSS-FORTRAN is a simulator for discrete, continuous, and combined models. This book provides a reference for GPSS-FORTRAN Version 3 and illustrates the use of the language with numerous examples.
Since the early seventies concepts of specification have become central in the whole area of computer science. Especially algebraic specification techniques for abstract data types and software systems have gained considerable importance in recent years. They have not only played a central role in the theory of data type specification, but meanwhile have had a remarkable influence on programming language design, system architectures, and software tools and environments. The fundamentals of algebraic specification lay a basis for teaching, research, and development in all those fields of computer science where algebraic techniques are the subject or are used with advantage on a conceptual level. Such a basis, however, we do not regard as a synopsis of all the different approaches and achievements but rather as a consistently developed theory. Such a theory should mainly emphasize elaboration of basic concepts from one point of view and, in a rigorous way, reach the state of the art in the field. We understand fundamentals in this context as: 1. Fundamentals in the sense of a carefully motivated introduction to algebraic specification, which is understandable for computer scientists and mathematicians. 2. Fundamentals in the sense of mathematical theories which are the basis for precise definitions, constructions, results, and correctness proofs. 3. Fundamentals in the sense of concepts from computer science, which are introduced on a conceptual level and formalized in mathematical terms.
This Festschrift volume, published in honor of Symeon Bozapalidis on the occasion of his retirement after more than 35 years of teaching activity, focuses on the subjects taught by Symeon, namely: algebra, linear algebra, mathematical logic, number theory, automata theory, tree languages and series, algebraic semantics, and fuzzy languages. Since 1982 -- at the Aristotle University of Thessaloniki -- Symeon's main interests have been closely connected with the algebraic foundations in computer science. In particular, he contributed to the development of the theory of tree languages and series, the axiomatization of graphs, picture theory, and fuzzy languages. The volume contains 15 invited papers, written by colleagues, friends, and students of Symeon. All of the papers were carefully refereed and are connected to his research topics. Most of the papers were presented at the Workshop on Algebraic Foundations in Computer Science, held in Thessaloniki, Greece, during November 7--8, 2011.
Visual languages have long been in pursuit of effective communication between human and machine. Today, they are successfully employed for end-user programming, modeling, rapid prototyping, and design activities by people of many disciplines including architects, artists, children, engineers, and scientists. Furthermore, with rapid advances of the Internet and Web technology, human-human communication through the Web or electronic mobile devices is becoming more and more prevalent. This manuscript provides a comprehensive introduction to diagrammatic visual programming languages and the technology of automatic generation of such languages. It covers a broad range of contents from the underlying theory of graph grammars to the applications in various domains. The contents were extracted from the papers that my Ph.D. students and I have published in the last 10 years, and are updated and organized in a coherent fashion. The manuscript gives an in-depth treatment of all the topic areas. Pointers to related work and further readings are also facilitated at the end of every chapter except Chapter 9. Rather than describing how to program visually, the manuscript discusses what visual programming languages are, and how such languages and their underlying foundations can be usefully applied to other fields in computer science that need graphs as the primary means of representation. Assuming a basic knowledge of computer programming and compiler construction, the manuscript can be used as a textbook for senior or graduate computer science classes on visual languages, or a reference book for programming language classes, practitioners, and researchers in the related field. The manuscript could not have been completed without the help of many people.
This book had its genesis in the following piece of computer mail: From allegra!joan-b Tue Dec 18 89:15:54 1984 To: sola!hjb Subject: lispm Hank, I've been talking with Mark Plotnik and Bill Gale about asking you to conduct a basic course on using the lisp machine. Mark, for instance, would really like to cover basics like the flavor system, etc., so he could start doing his own programming without a lot of trial and error, and Bill and I would be interested in this, too. I'm quite sure that Mark Jones, Bruce, Eric and Van would also be really interested. Would you like to do it? Bill has let me know that if you'd care to set something up, he's free to meet with us anytime this week or next (although I'll only be here on Wed. next week) so we can come up with a plan. What do you think? Joan. (All the people and computers mentioned above work at AT&T Bell Laboratories, in Murray Hill, New Jersey.) I agreed, with some trepidation, to try teaching such a course. It wasn't clear how I was going to explain the Lisp Machine environment to a few dozen beginners when at the time I felt I was scarcely able to keep myself afloat. Particularly since many of the "beginners" had PhD's in computer science and a decade or two of programming experience.
The growing demand for systems of ever-increasing complexity and precision has stimulated the need for higher level concepts, tools, and techniques in every area of Computer Science. Some of these areas, in particular Artificial Intelligence, Databases, and Programming Languages, are attempting to meet this demand by defining a new, more abstract level of system description. We call this new level conceptual in recognition of its basic conceptual nature. In Artificial Intelligence, the problem of designing an expert system is seen primarily as a problem of building a knowledge base that represents knowledge about an enterprise. Consequently, Knowledge Representation is viewed as a central issue in Artificial Intelligence research. Database design methodologies developed during the last five years are almost unanimous in offering semantic data models in terms of which the designer directly and naturally models an enterprise before proceeding to a detailed logical and physical database design. In Programming Languages, different forms of abstraction which allow implementation independent specifications of data, functions, and control have been a major research theme for a decade. To emphasize the common goals of these three research efforts, we call this new activity conceptual modelling.
The microcomputer has put a vast amount of computational power in the hands of the practicing chemical engineer. However, a microcomputer is of little use unless there are programs available to solve chemical engineering problems. In this book, I have put together a collection of BASIC programs that will help the practicing engineer be more productive and able to solve complex problems that are normally handled on mainframe computers. The plant engineer will find the book particularly useful. The plant engineer is called upon to investigate problems that range from simple troubleshooting to the detailed design of complex chemical plants. The larger projects are usually add-on jobs to the regular duties of keeping a chemical plant running. In today's business climate, answers to problems must be obtained quickly and accurately. The computer is capable of testing hypotheses, thereby allowing engineers to evaluate alternative solutions to problems quickly and provide answers to management's questions that invariably shift like the sands in a desert.
Research into Fully Integrated Data Environments (FIDE) has the goal of substantially improving the quality of application systems while reducing the cost of building and maintaining them. Application systems invariably involve the long-term storage of data over months or years. Much unnecessary complexity obstructs the construction of these systems when conventional databases, file systems, operating systems, communication systems, and programming languages are used. This complexity limits the sophistication of the systems that can be built, generates operational and usability problems, and deleteriously impacts both reliability and performance. This book reports on the work of researchers in the Esprit FIDE projects to design and develop a new integrated environment to support the construction and operation of such persistent application systems. It reports on the principles they employed to design it, the prototypes they built to test it, and their experience using it.
There is an established interest in integrating databases and programming languages. This book on Data Types and Persistence evolved from the proceedings of a workshop held at the Appin in August 1985. The purpose of the Appin workshop was to focus on these two aspects: persistence and data types, and to bring together people from various disciplines who have thought about these problems. Particular topics of interest include the design of type systems appropriate for database work, the representation of persistent objects such as data types and modules, and the provision of orthogonal persistence and certain aspects of transactions and concurrency. The programme was broken into three sessions: morning, late afternoon and evening to allow the participants to take advantage of two beautiful days in the Scottish Highlands. The financial assistance of the Science and Engineering Research Council, the National Science Foundation and International Computers Ltd. is gratefully acknowledged. We would also like to thank Isabel Graham, Anne Donnelly and Estelle Taylor for their help in organising the workshop. Finally our thanks to Pete Bailey, Ray Carick and Dave Munro for the immense task they undertook in typesetting the book. The convergence of programming languages and databases to a coherent and consistent whole requires ideas from, and adjustment in, both intellectual camps. The first group of chapters in this book present ideas and adjustments coming from the programming language research community. This community frequently discusses types and uses them as a framework for other discussions.
This book is an updated version of my Ph.D. dissertation, The AND/OR Process Model for Parallel Interpretation of Logic Programs. The three years since that paper was finished (or so I thought then) have seen quite a bit of work in the area of parallel execution models and programming languages for logic programs. A quick glance at the bibliography here shows roughly 50 papers on these topics, 40 of which were published after 1983. The main difference between the book and the dissertation is the updated survey of related work. One of the appendices in the dissertation was an overview of a Prolog implementation of an interpreter based on the AND/OR Process Model, a simulator I used to get some preliminary measurements of parallelism in logic programs. In the last three years I have been involved with three other implementations. One was written in C and is now being installed on a small multiprocessor at the University of Oregon. Most of the programming of this interpreter was done by Nitin More under my direction for his M.S. project. The other two, one written in Multilisp and the other in Modula-2, are more limited, intended to test ideas about implementing specific aspects of the model. Instead of an appendix describing one interpreter, this book has more detail about implementation included in Chapters 5 through 7, based on a combination of ideas from the four interpreters.
I am very pleased to write these few brief paragraphs introducing this book, and would like to take this opportunity to attempt to set the Toolpack project in an appropriate historical context. The Toolpack project must be considered to have actually begun in the Fall of 1978, when Prof. Webb C. Miller, at a meeting at Jet Propulsion Laboratories in Pasadena, California, suggested that there be a large-scale project, called Toolpack, aimed at pulling together a comprehensive collection of mathematical software development tools. It was suggested that the project follow the pattern of other "Pack" projects, such as Eispack, Linpack, and Funpack, which had assembled and systematized comprehensive collections of mathematical software in such areas as eigenvalue computation, linear equation solution and special function approximation. From the beginning it was recognized that the Toolpack project would differ significantly from these earlier "Pack" projects in that it was attempting to assemble and systematize software in an area which was not well established and understood. Thus it was not clear how to organize and integrate the tools we were to collect into Toolpack. As a consequence Toolpack became simultaneously a research project and a development project. The research was aimed at determining effective strategies for large-scale integration of large-scale software tools, and the development project was aimed at implementing these strategies and using them to put high quality tools at the disposal of working mathematical software writers.
Accompanying the book, as with all TELOS sponsored publications, is an electronic component. In this case it is a DOS diskette produced by one of the coauthors, Paul Wellin. This diskette consists of Mathematica notebooks and packages which contain the code for all examples and exercises in the book, as well as additional materials intended to extend many ideas covered in the text. It is of great value to teachers, students, and others using this book to learn how to effectively program with Mathematica.
The second half of the 1970s was marked with impressive advances in array/vector architectures and vectorization techniques and compilers. This progress continued with a particular focus on vector machines until the middle of the 1980s. The majority of supercomputers during this period were register-to-register (Cray 1) or memory-to-memory (CDC Cyber 205) vector (pipelined) machines. However, the increasing demand for higher computational rates led naturally to parallel computers and software. Through the replication of autonomous processors in a coordinated system, one can skip over performance barriers due to technology limitations. In principle, parallelism offers unlimited performance potential. Nevertheless, it is very difficult to realize this performance potential in practice. So far, we have seen only the tip of the iceberg called "parallel machines and parallel programming." Parallel programming in particular is a rapidly evolving art and, at present, highly empirical. In this book we discuss several aspects of parallel programming and parallelizing compilers. Instead of trying to develop parallel programming methodologies and paradigms, we often focus on more advanced topics assuming that the reader has an adequate background in parallel processing. The book is organized in three main parts. In the first part (Chapters 1 and 2) we set the stage and focus on program transformations and parallelizing compilers. The second part of this book (Chapters 3 and 4) discusses scheduling for parallel machines from the practical point of view (i.e., macro- and microtasking and supporting environments). Finally, the last part...
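To make the loop-level parallelism concrete, here is a small sketch of ours (not an example from the book) showing the kind of independent-iteration loop that program transformations and parallelizing compilers target; the OpenMP pragma is a modern stand-in for the macro/microtasking environments of that era:

```c
#include <stdio.h>

#define N 1000000

int main(void)
{
    static double a[N], b[N];

    for (int i = 0; i < N; ++i)
        b[i] = (double)i;

    /* Every iteration writes a distinct a[i] and reads only b[i], so the
       loop can be split across processors with no synchronization.
       Compile with -fopenmp (gcc/clang); without it the pragma is ignored
       and the loop runs serially, with the same result. */
    #pragma omp parallel for
    for (int i = 0; i < N; ++i)
        a[i] = 2.0 * b[i] + 1.0;

    printf("a[%d] = %f\n", N - 1, a[N - 1]);
    return 0;
}
```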
This book constitutes the refereed proceedings of the 14th International Conference on Model Driven Engineering Languages and Systems, MODELS 2011, held in Wellington, New Zealand, in October 2011. The papers address a wide range of topics in research (foundations track) and practice (applications track). For the first time a new category of research papers, vision papers, are included presenting "outside the box" thinking. The foundations track received 167 full paper submissions, of which 34 were selected for presentation. Out of these, 3 papers were vision papers. The application track received 27 submissions, of which 13 papers were selected for presentation. The papers are organized in topical sections on model transformation, model complexity, aspect oriented modeling, analysis and comprehension of models, domain specific modeling, models for embedded systems, model synchronization, model based resource management, analysis of class diagrams, verification and validation, refactoring models, modeling visions, logics and modeling, development methods, and model integration and collaboration.