This unusual introduction to Maple shows readers how Maple or any other computer algebra system fits naturally into a mathematically oriented work environment. Designed for mathematicians, engineers, econometricians, and other scientists, this book shows how computer algebra can enhance their theoretical work. A CD-ROM contains all the Maple worksheets presented in the book.
This upper-division laboratory supplement for courses in abstract algebra consists of several Mathematica packages programmed as a foundation for group and ring theory. The "user's guide" illustrates the functionality of the underlying code, while the lab portion of the book reflects the contents of the Mathematica-based electronic notebooks. Students interact with both the printed and electronic versions of the material in the laboratory, and can look up details and reference information in the user's guide. Exercises occur in the stream of the text of the lab, which provides a context within which to answer them, and the questions are designed to be answered either in the electronic notebook or on paper. The notebooks are available for both the 2.2 and 3.0 versions of Mathematica, and run on all platforms for which Mathematica exists. This is a timely and unique addition to the undergraduate abstract algebra curriculum, filling a significant void in the literature.
This volume contains the Keynote, Invited and Full Contributed papers presented at COMPSTAT'98. A companion volume (Payne & Lane, 1998) contains papers describing the Short Communications and Posters. COMPSTAT is a one-week conference held every two years under the auspices of the International Association for Statistical Computing, a section of the International Statistical Institute. COMPSTAT'98 was organised by IACR-Rothamsted, IACR-Long Ashton, the University of Bristol Department of Mathematics and the University of Bath Department of Mathematical Sciences, and took place from 24-28 August 1998 at the University of Bristol. Previous COMPSTATs (from 1974-1996) were held in Vienna, Berlin, Leiden, Edinburgh, Toulouse, Prague, Rome, Copenhagen, Dubrovnik, Neuchatel, Vienna and Barcelona. The conference is the main European forum for developments at the interface between statistics and computing, encapsulated as follows in the COMPSTAT'98 Call for Papers: "Statistical computing provides the link between statistical theory and applied statistics. The scientific programme of COMPSTAT ranges over all aspects of this link, from the development and implementation of new computer-based statistical methodology through to innovative applications and software evaluation. The programme should appeal to anyone working in statistics and using computers, whether in universities, industrial companies, research institutes or as software developers."
This "hands-on" book is for people who are interested in immediately putting Maple to work. The reader is provided with a compact, fast and surveyable guide that introduces them to the extensive capabilities of the software. The book is sufficient for standard use of Maple and provides techniques for extending Maple for more specialized work. The author discusses the reliability of results systematically and presents ways of testing questionable results. The book allows readers to become users almost immediately and helps them grow gradually into broader and more proficient use. As a consequence, some subjects are dealt with in an introductory way early in the book, with references to a more detailed discussion later on.
S+SPATIALSTATS is the first comprehensive, object-oriented package for the analysis of spatial data. Providing a whole new set of analysis tools, S+SPATIALSTATS was created specifically for the exploration and modeling of spatially correlated data. It can be used to analyze data arising in areas such as environmental, mining, and petroleum engineering, natural resources, geography, epidemiology, demography, and other fields where data is sampled spatially. This user's manual provides the documentation for the S+SPATIALSTATS module.
This companion to The New Statistical Analysis of Data by Anderson and Finn provides a hands-on guide to data analysis using SPSS. Included with this guide are instructions for obtaining the data sets to be analysed via the World Wide Web. First, the authors provide a brief review of using SPSS; then, following the organisation of The New Statistical Analysis of Data, readers participate in analysing many of the data sets discussed in the book. In so doing, students learn how to conduct reasonably sophisticated statistical analyses using SPSS while at the same time gaining insight into the nature and purpose of statistical investigation.
Past events have shed light on the vulnerability of mission-critical computer systems at highly sensitive levels. It has been demonstrated that common hackers can use tools and techniques downloaded from the Internet to attack government and commercial information systems. Although threats may come from mischief makers and pranksters, they are more likely to result from hackers working in concert for profit, hackers working under the protection of nation states, or malicious insiders. Securing an IT Organization through Governance, Risk Management, and Audit introduces two internationally recognized bodies of knowledge: Control Objectives for Information and Related Technology (COBIT 5) from a cybersecurity perspective and the NIST Framework for Improving Critical Infrastructure Cybersecurity (CSF). Emphasizing the processes directly related to governance, risk management, and audit, the book provides details of a cybersecurity framework (CSF), mapping each of the CSF steps and activities to the methods defined in COBIT 5. This method leverages operational risk understanding in a business context, allowing the information and communications technology (ICT) organization to convert high-level enterprise goals into manageable, specific goals rather than unintegrated checklist models. The real value of this methodology is to reduce the knowledge fog that frequently engulfs senior business management, and results in the false conclusion that overseeing security controls for information systems is not a leadership role or responsibility but a technical management task. By carefully reading, implementing, and practicing the techniques and methodologies outlined in this book, you can successfully implement a plan that increases security and lowers risk for you and your organization.
Maple V Mathematics Programming Guide is the fully updated language and programming reference for Maple V Release 5. It presents a detailed description of Maple V Release 5 - the latest release of the powerful, interactive computer algebra system used worldwide as a tool for problem-solving in mathematics, the sciences, engineering, and education. This manual describes the use of both numeric and symbolic expressions, the data types available, and the programming language statements in Maple. It shows how the system can be extended or customized through user defined routines and gives complete descriptions of the system's user interface and 2D and 3D graphics capabilities.
The International Federation for Information Processing, IFIP, is a multinational federation of professional technical organisations concerned with information processing. IFIP is dedicated to improving communication and increasing understanding among practitioners of all nations about the role information processing can play in all walks of life. This Working Conference, Secondary School Mathematics in the World of Communication Technologies: Learning, Teaching and the Curriculum, was organised by Working Group 3.1, Informatics in Secondary Education, of the IFIP Technical Committee for Education, TC3. This is the third conference on this theme organised by WG 3.1; the previous two were held in Varna, Bulgaria, in 1977, and Sofia, Bulgaria, in 1987 (proceedings published by North-Holland Elsevier). The aim of the conference was to take a forward look at the relationships between mathematics and the new technologies of information and communication, in the context of the increased availability of interactive and dynamic information processing tools. The main focus was on the mathematics education of students in the age range of about 11 to 18 years, and the following themes were addressed: * Curriculum: curriculum evolution; relationships with informatics; * Teachers: professional development; methodology and practice; * Learners: tools and techniques; concept development; research and theory; * Human and social issues: culture and policy; personal impact.
Presents the main ideas of computer-intensive statistical methods. Gives the algorithms for all the methods. Uses various plots and illustrations to explain the main ideas. Features the theoretical backgrounds of the main methods. Includes R code for the methods and examples.
Since the beginning of the seventies, computer hardware has been available for programming a variety of tasks. During the nineties the hardware developed from the big mainframes to personal workstations. Nowadays it is not only the hardware that is much more powerful: compared with the seventies, a workstation can do far more work than a mainframe. In parallel we find a specialization in the software. Languages like COBOL for business-oriented programming or Fortran for scientific computing only marked the beginning. Already at the beginning of the seventies, special languages like SAS or SPSS were available for statisticians, and the introduction of personal computers in the eighties gave new impulses for even further development. Now that personal computers have become very popular, the number of programs has started to explode. Today we find a wide variety of programs for almost any statistical purpose (Koch & Haag 1995).
Using a visual data analysis approach, wavelet concepts are explained in a way that is intuitive and easy to understand. Furthermore, in addition to wavelets, a whole range of related signal processing techniques such as wavelet packets, local cosine analysis, and matching pursuits are covered, and applications of wavelet analysis are illustrated, including nonparametric function estimation, digital image compression, and time-frequency signal analysis. This book and software package is intended for a broad range of data analysts, scientists, and engineers. While most textbooks on the subject presuppose advanced training in mathematics, this book merely requires that readers be familiar with calculus and linear algebra at the undergraduate level.
COMPSTAT symposia have been held regularly since 1974, when they started in Vienna. This tradition has made COMPSTAT a major forum for the interplay of statistics and computer science, with contributions from many well-known scientists from all over the world. The scientific programme of COMPSTAT '96 covers all aspects of this interplay, from user experiences and evaluation of software through the development and implementation of new statistical ideas. All papers presented belong to one of the three following categories: - Statistical methods (preferably new ones) that require a substantial use of computing; - Computer environments, tools and software useful in statistics; - Applications of computational statistics in areas of substantial interest (environment, health, industry, biometrics, etc.).
A greatly expanded and heavily revised second edition, this popular guide provides instructions and clear examples for running analyses of variance (ANOVA) and several other related statistical tests of significance with SPSS. No other guide offers the program statements required for the more advanced tests in analysis of variance. All of the programs in the book can be run using any version of SPSS, including versions 11 and 11.5. A table at the end of the preface indicates where each type of analysis (e.g., simple comparisons) can be found for each type of design (e.g., mixed two-factor design).
Now in its second edition, this textbook provides an introduction to Python and its use for statistical data analysis. It covers common statistical tests for continuous, discrete and categorical data, as well as linear regression analysis and topics from survival analysis and Bayesian statistics. For this new edition, the introductory chapters on Python, data input and visualization have been reworked and updated. The chapter on experimental design has been expanded, and programs for the determination of confidence intervals commonly used in quality control have been introduced. The book also features a new chapter on finding patterns in data, including time series. A new appendix describes useful programming tools, such as testing tools, code repositories, and GUIs. The provided working code for Python solutions, together with easy-to-follow examples, will reinforce the reader's immediate understanding of the topic. Accompanying data sets and Python programs are also available online. With recent advances in the Python ecosystem, Python has become a popular language for scientific computing, offering a powerful environment for statistical data analysis. With examples drawn mainly from the life and medical sciences, this book is intended primarily for masters and PhD students. As it provides the required statistics background, the book can also be used by anyone who wants to perform a statistical data analysis.
The emphasis of the book is on how to construct different types of solutions (exact, approximate analytical, numerical, graphical) of numerous nonlinear PDEs correctly, easily, and quickly. The reader can learn a wide variety of techniques and solve the numerous nonlinear PDEs included in the book (and many other differential equations) by simplifying and transforming the equations and solutions with the arbitrary functions and parameters presented in the book. Numerous comparisons and relationships between various types of solutions and between different methods and approaches are provided, and comparing the results obtained in Maple and Mathematica facilitates a deeper understanding of the subject. Among the large number of CASs, we chose the two systems, Maple and Mathematica, that are used worldwide by students, research mathematicians, scientists, and engineers. As in our previous books, we propose using both systems, Maple and Mathematica, in parallel, since in many research problems it is frequently required to compare independent results obtained by using different computer algebra systems at all stages of the solution process. One of the main points (related to CAS) is the implementation of a whole solution method: e.g., starting from an analytical derivation of the exact governing equations, constructing discretizations and analytical formulas for a numerical method, performing the numerical procedure, obtaining various visualizations, and comparing the numerical solution obtained with other types of solutions considered in the book, e.g. an asymptotic solution.
The book, which contains over two hundred illustrations, is designed for use in school computer labs or with home computers, running the computer algebra system Maple, or its student version. It supports the interactive Maple worksheets, which the authors have developed and which are available free of charge via anonymous ftp (ftp.utirc.utoronto.ca (/pub/ednet/maths/maple)). The book addresses readers who are learning calculus at a pre-university level.
These lecture notes provide a rapid, accessible introduction to Bayesian statistical methods. The course covers the fundamental philosophy and principles of Bayesian inference, including the reasoning behind the prior/likelihood model construction synonymous with Bayesian methods, through to advanced topics such as nonparametrics, Gaussian processes and latent factor models. These advanced modelling techniques can easily be applied using computer code samples written in Python and Stan, which are integrated into the main text. Importantly, the reader will learn methods for assessing model fit and for choosing between rival modelling approaches.
This book is a collection of thirty invited papers covering the important parts of the rapidly developing area of computational statistics. All contributions present a specialized topic in a tutorial and comprehensive style, and the newest results and developments are discussed. Starting with the foundations of computational statistics, i.e. numerical reliability of software packages and construction principles for pseudorandom number generators, the volume includes design considerations on statistical programming languages and the basic issues of resampling techniques. Also covered are areas like design of experiments, graphical techniques, modelling and testing problems, a review of clustering algorithms, and concise discussions of regression trees and cognitive aspects of authoring systems.
Mathematica combines symbolic and numerical calculations, plots, graphics programming, list calculations and structured documentation into an interactive environment. This book covers the program and shows with practical examples how even more complex problems can be solved with just a few commands. From the reviews: "A valuable introductory textbook on Mathematica and is very useful to scientists and engineers who use Mathematica in their work." -- ZENTRALBLATT MATH
This book assembles papers which were presented at the biennial symposium in Computational Statistics held under the auspices of the International Association for Statistical Computing (IASC), a section of ISI, the International Statistical Institute. This symposium, named COMPSTAT '94, was organized by the Statistical Institutes of the University of Vienna and the University of Technology of Vienna, Austria. The series of COMPSTAT symposia started in 1974 in Vienna. Since then they have taken place every other year in Berlin (Germany, 1976), Leiden (The Netherlands, 1978), Edinburgh (Great Britain, 1980), Toulouse (France, 1982), Prague (Czechoslovakia, 1984), Rome (Italy, 1986), Copenhagen (Denmark, 1988), Dubrovnik (Yugoslavia, 1990) and Neuchatel (Switzerland, 1992). This year we are celebrating the 20th anniversary in Vienna, Austria. A movement has clearly been observed from "traditional" computational statistics, with emphasis on methods which produce results quickly and reliably, to computationally intensive methods like resampling procedures, Bayesian methods and dynamic graphics, and on to very recent areas like neural networks, an accentuation on spatial statistics, huge data sets, and analysis strategies. For the organization of the symposium, new written guidelines worked out by the IASC were in effect this time. The goal was to refresh somewhat the spirit of the start of COMPSTAT '74, keep the tradition of the series and ensure a certain continuity in the sequence of biennial meetings.
The analysis of time series data is an important aspect of data analysis across a wide range of disciplines, including statistics, mathematics, business, engineering, and the natural and social sciences. This package provides both an introduction to time series analysis and an easy-to-use version of a well-known time series computing package called Interactive Time Series Modelling. The programs in the package are intended as a supplement to the text Time Series: Theory and Methods, 2nd edition, also by Peter J. Brockwell and Richard A. Davis. Many researchers and professionals will appreciate this straightforward approach enabling them to run desk-top analyses of their time series data. Amongst the many facilities available are tools for: ARIMA modelling, smoothing, spectral estimation, multivariate autoregressive modelling, transfer-function modelling, forecasting, and long-memory modelling. This version is designed to run under Microsoft Windows 3.1 or later. It comes with two diskettes: one suitable for less powerful machines (IBM PC 286 or later with 540K available RAM and 1.1 MB of hard disk space) and one for more powerful machines (IBM PC 386 or later with 8MB of RAM and 2.6 MB of hard disk space available).
Maple is a computer algebra system with a fast-growing number of users in universities, schools and other institutions. Werner Burkhardt provides a detailed step-by-step introduction for all first-time users, enabling you to become familiar with the way Maple works as quickly and easily as possible. Using as examples problems from many different aspects of mathematics, problem solving with Maple is fully described in this easy-to-follow tutorial text. Each chapter is self-contained, so you can easily select areas of your own special interest. There are some 'test yourself' problems at the end of each chapter to check your progress, with solutions provided at the end of the book.
Master the tools of MATLAB through hands-on examples. The mathematical software MATLAB (R) integrates computation, visualization, and programming to produce a powerful tool for a number of different tasks in mathematics. Focusing on the MATLAB toolboxes especially dedicated to science, finance, and engineering, MATLAB (R) with Applications to Engineering, Physics and Finance explains how to perform complex mathematical tasks with relatively simple programs. This versatile book is accessible enough for novices and users with only a fundamental knowledge of MATLAB, yet covers many sophisticated concepts to make it helpful for experienced users as well. The author first introduces the basics of MATLAB, describing simple functions such as differentiation, integration, and plotting. He then addresses advanced topics, including programming, producing executables, publishing results directly from MATLAB programs, and creating graphical user interfaces. The text also presents examples of Simulink (R) that highlight the advantages of using this software package for system modeling and simulation. The applications-dedicated chapters at the end of the book explore the use of MATLAB in digital signal processing, chemical and food engineering, astronomy, optics, financial derivatives, and much more.
Design and Analysis of Experiments with R presents a unified treatment of experimental designs and design concepts commonly used in practice. It connects the objectives of research to the type of experimental design required, describes the process of creating the design and collecting the data, shows how to perform the proper analysis of the data, and illustrates the interpretation of results. Drawing on his many years of working in the pharmaceutical, agricultural, industrial chemicals, and machinery industries, the author teaches students how to: Make an appropriate design choice based on the objectives of a research project Create a design and perform an experiment Interpret the results of computer data analysis The book emphasizes the connection among the experimental units, the way treatments are randomized to experimental units, and the proper error term for data analysis. R code is used to create and analyze all the example experiments. The code examples from the text are available for download on the author's website, enabling students to duplicate all the designs and data analysis. Intended for a one-semester or two-quarter course on experimental design, this text covers classical ideas in experimental design as well as the latest research topics. It gives students practical guidance on using R to analyze experimental data.