The quantity, diversity and availability of transport data are increasing rapidly, requiring new skills in the management and interrogation of data and databases. Recent years have seen a new wave of 'big data', 'Data Science', and 'smart cities' changing the world, with the Harvard Business Review describing Data Science as the "sexiest job of the 21st century". Transportation professionals and researchers need to be able to use data and databases in order to establish quantitative, empirical facts, and to validate and challenge their mathematical models, whose axioms have traditionally often been assumed rather than rigorously tested against data. This book takes a highly practical approach to learning about Data Science tools and their application to investigating transport issues. The focus is principally on practical, professional work with real data and tools, including business and ethical issues.

"Transport modeling practice was developed in a data-poor world, and many of our current techniques and skills are built on that sparsity. In a new data-rich world, the required tools are different and the ethical questions around data and privacy are definitely different. I am not sure whether current professionals have these skills; and I am certainly not convinced that our current transport modeling tools will survive in a data-rich environment. This is an exciting time to be a data scientist in the transport field. We are trying to get to grips with the opportunities that big data sources offer; but at the same time such data skills need to be fused with an understanding of transport, and of transport modeling. Those with these combined skills can be instrumental in providing better, faster, cheaper data for transport decision-making, and ultimately contribute to innovative, efficient, data-driven modeling techniques of the future. It is not surprising that this course, this book, has been authored by the Institute for Transport Studies. To do this well, you need a blend of academic rigor and practical pragmatism. There are few educational or research establishments better equipped to do that than ITS Leeds." - Tom van Vuren, Divisional Director, Mott MacDonald

"WSP is proud to be a thought leader in the world of transport modelling, planning and economics, and has a wide range of opportunities for people with skills in these areas. The evidence base and forecasts we deliver to effectively implement strategies and schemes are ever more data- and technology-focused, a trend we have helped shape since the 1970s, but with particular disruption and opportunity in recent years. As a result of these trends, and to suitably skill the next generation of transport modellers, we asked the world-leading Institute for Transport Studies to boost skills in these areas, and they have responded with a new MSc programme which you too can now study via this book." - Leighton Cardwell, Technical Director, WSP

"From processing and analysing large datasets, to automation of modelling tasks sometimes requiring different software packages to "talk" to each other, to data visualization, SYSTRA employs a range of techniques and tools to provide our clients with deeper insights and effective solutions. This book does an excellent job in giving you the skills to manage, interrogate and analyse databases, and develop powerful presentations. Another important publication from ITS Leeds." - Fitsum Teklu, Associate Director (Modelling & Appraisal), SYSTRA Ltd

"Urban planning has relied for decades on statistical and computational practices that have little to do with mainstream data science. Information is still often used as evidence on the impact of new infrastructure even when it hardly contains any valid evidence. This book is an extremely welcome effort to provide young professionals with the skills needed to analyse how cities and transport networks actually work. The book is also highly relevant to anyone who will later want to build digital solutions to optimise urban travel based on emerging data sources." - Yaron Hollander, author of "Transport Modelling for a Complete Beginner"
This book provides practical applications of doubly classified models by using R syntax to generate the models. It also presents these models in symbolic tables so as to cater to those who are not mathematically inclined, while numerous examples throughout the book illustrate the concepts and their applications. For those who are not aware of this modeling approach, it serves as a good starting point to acquire a basic understanding of doubly classified models. It is also a valuable resource for academics, postgraduate students, undergraduates, data analysts and researchers who are interested in examining square contingency tables.
This book presents new findings on nonregular statistical estimation. Unlike other books on this topic, its major emphasis is on helping readers understand the meaning and implications of both regularity and irregularity through a certain family of distributions. In particular, it focuses on a truncated exponential family of distributions with a natural parameter and truncation parameter as a typical nonregular family. This focus includes the (truncated) Pareto distribution, which is widely used in various fields such as finance, physics, hydrology, geology, astronomy, and other disciplines. The family is essential in that it links both regular and nonregular distributions, as it becomes a regular exponential family if the truncation parameter is known. The emphasis is on presenting new results on the maximum likelihood estimation of a natural parameter or truncation parameter if one of them is a nuisance parameter. In order to obtain more information on the truncation, the Bayesian approach is also considered. Further, the application to some useful truncated distributions is discussed. The illustrated clarification of the nonregular structure provides researchers and practitioners with a solid basis for further research and applications.
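As a standard illustration of the kind of family the book studies (this example is well known and is not taken from the book's own text), the Pareto density with shape parameter $\alpha$ and lower truncation point $\nu$ is

$$ f(x;\alpha,\nu)=\frac{\alpha\,\nu^{\alpha}}{x^{\alpha+1}},\qquad x\ge\nu>0,\ \alpha>0. $$

When $\nu$ is known, this is a regular one-parameter exponential family in $\alpha$ (with sufficient statistic $\log x$); treating $\nu$ as an unknown truncation parameter is what breaks the usual regularity conditions, which is precisely the structure the book investigates.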
Statistics with JMP: Hypothesis Tests, ANOVA and Regression. Peter Goos, University of Leuven and University of Antwerp, Belgium; David Meintrup, University of Applied Sciences Ingolstadt, Germany. A first course on basic statistical methodology using JMP. This book provides a first course on parameter estimation (point estimates and confidence interval estimates), hypothesis testing, ANOVA and simple linear regression. The authors' approach combines mathematical depth with numerous examples and demonstrations using the JMP software. Key features:
* Provides a comprehensive and rigorous presentation of introductory statistics that has been extensively classroom tested.
* Pays attention to the usual parametric hypothesis tests as well as to non-parametric tests (including the calculation of exact p-values).
* Discusses the power of various statistical tests, along with examples in JMP to enable insight into this difficult topic.
* Promotes the use of graphs and confidence intervals in addition to p-values.
* Course materials and tutorials for teaching are available on the book's companion website.
Master's and advanced students in applied statistics, industrial engineering, business engineering, civil engineering and bio-science engineering will find this book beneficial. It also provides a useful resource for teachers of statistics, particularly in the area of engineering.
This proceedings volume contains eight selected papers that were presented at the International Symposium in Statistics (ISS) 2015 on Advances in Parametric and Semi-parametric Analysis of Multivariate, Time Series, Spatial-temporal, and Familial-longitudinal Data, held in St. John's, Canada, from July 6 to 8, 2015. The main objective of ISS-2015 was the discussion of advances and challenges in parametric and semi-parametric analysis of correlated data in both continuous and discrete setups. Thus, as a reflection of the theme of the symposium, the eight papers of this proceedings volume are presented in four parts. Part I comprises papers examining Elliptical t Distribution Theory. In Part II, the papers cover spatial and temporal data analysis. Part III is focused on longitudinal multinomial models in parametric and semi-parametric setups. Finally, Part IV concludes with a paper on inference for longitudinal data subject to the challenge of selecting important covariates from the large number of covariates available for the individuals in the study.
This book provides a friendly introduction to the paradigm and proposes a broad panorama of killer applications of the Infinity Computer in optimization: radically new numerical algorithms, great theoretical insights, efficient software implementations, and interesting practical case studies. This is the first book presenting, to readers interested in optimization, the advantages of a recently introduced supercomputing paradigm that allows one to work numerically with different infinities and infinitesimals on the Infinity Computer, patented in several countries. One of the editors of the book is the creator of the Infinity Computer, and another editor was the first to use it in optimization. Their results have been recognized with numerous scientific prizes. This engaging book opens new horizons for researchers, engineers, professors, and students with interests in supercomputing paradigms, optimization, decision making, game theory, and the foundations of mathematics and computer science. "Mathematicians have never been comfortable handling infinities... But an entirely new type of mathematics looks set to by-pass the problem... Today, Yaroslav Sergeyev, a mathematician at the University of Calabria in Italy, solves this problem..." MIT Technology Review. "These ideas and future hardware prototypes may be productive in all fields of science where infinite and infinitesimal numbers (derivatives, integrals, series, fractals) are used." A. Adamatzky, Editor-in-Chief of the International Journal of Unconventional Computing. "I am sure that the new approach ... will have a very deep impact both on Mathematics and Computer Science." D. Trigiante, Computational Management Science. "Within the grossone framework, it becomes feasible to deal computationally with infinite quantities, in a way that is both new (in the sense that previously intractable problems become amenable to computation) and natural." R. Gangle, G. Caterina, F. Tohme, Soft Computing. "The computational features offered by the Infinity Computer allow us to dynamically change the accuracy of representation and floating-point operations during the flow of a computation. When suitably implemented, this possibility turns out to be particularly advantageous when solving ill-conditioned problems. In fact, compared with a standard multi-precision arithmetic, here the accuracy is improved only when needed, thus not affecting that much the overall computational effort." P. Amodio, L. Brugnano, F. Iavernaro & F. Mazzia, Soft Computing
Data Presentation with SPSS Explained provides students with all the information they need to conduct small-scale analyses of research projects using SPSS and present their results appropriately in their reports. Quantitative data can be collected in the form of a questionnaire, survey or experimental study. This book focuses on presenting this data clearly, in the form of tables and graphs, along with creating basic summary statistics. Data Presentation with SPSS Explained uses an example survey that is clearly explained step-by-step throughout the book. This allows readers to follow the procedures, and easily apply each step in the process to their own research and findings. No prior knowledge of statistics or SPSS is assumed, and everything in the book is carefully explained in a helpful and user-friendly way using worked examples. This book is the perfect companion for students from a range of disciplines including psychology, business, communication, education, health, humanities, marketing and nursing, many of whom are unaware that this extremely helpful program is available at their institution for their use.
This book expounds the principle and related applications of nonlinear principal component analysis (PCA), a useful method for analyzing data with mixed measurement levels. In the part dealing with the principle, after a brief introduction to ordinary PCA, a PCA for categorical data (nominal and ordinal) is introduced as nonlinear PCA, in which an optimal scaling technique is used to quantify the categorical variables. Alternating least squares (ALS) is the main algorithm in the method. Multiple correspondence analysis (MCA), a special case of nonlinear PCA, is also introduced. All formulations in these methods are integrated in the same manner as matrix operations. Because data at any measurement level can be treated consistently as numerical data and ALS is a very powerful tool for estimation, the methods can be utilized in a variety of fields such as biometrics, econometrics, psychometrics, and sociology. In the applications part of the book, four applications are introduced: variable selection for mixed-measurement-level data, sparse MCA, joint dimension reduction and clustering methods for categorical data, and acceleration of ALS computation. The variable selection methods in PCA that were originally developed for numerical data can be applied to any measurement level by using nonlinear PCA. Sparseness and joint dimension reduction and clustering for nonlinear data, the results of recent studies, are extensions obtained through the same matrix operations used in nonlinear PCA. Finally, an acceleration algorithm is proposed to reduce the computational cost of the ALS iteration in nonlinear multivariate methods. This book thus presents the usefulness of nonlinear PCA, which can be applied to data at different measurement levels in diverse fields. It also covers the latest topics, including the extension of the traditional statistical method, newly proposed nonlinear methods, and computational efficiency in the methods.
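The book's starting point, ordinary PCA on numerical data, can be summarized in a few lines; the following Python/NumPy sketch (illustrative only, on made-up data, and not the book's own code) computes component scores and explained variance via the singular value decomposition, the baseline that nonlinear PCA extends to categorical variables through optimal scaling and ALS.

```python
# Minimal sketch of ordinary PCA via the SVD on hypothetical numerical data.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))            # 50 observations, 4 numerical variables

Xc = X - X.mean(axis=0)                 # center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = Xc @ Vt.T[:, :2]               # component scores (first two components)
explained = (s**2) / (s**2).sum()       # proportion of variance per component
print(explained[:2])
```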
Network Analysis with R is a comprehensive resource for mastering network analysis in R; its goal is to introduce modern network analysis techniques in R to social, physical, and health scientists. The mathematical foundations of network analysis are emphasized in an accessible way and readers are guided through the basic steps of network studies: network conceptualization, data collection and management, network description, visualization, and building and testing statistical models of networks. As with all of the books in the Use R! series, each chapter contains extensive R code and detailed visualizations of datasets. Appendices describe the R network packages and the datasets used in the book. An R package developed specifically for the book, available to readers on GitHub, contains relevant code and real-world network datasets as well.
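The basic workflow the book walks through (network conceptualization, description, and visualization) can be illustrated with a minimal sketch; the book itself works in R, so the following Python/networkx version is only an analogue on hypothetical data, not the book's code.

```python
# Build, describe, and visualize a tiny undirected network (hypothetical edges).
import networkx as nx
import matplotlib.pyplot as plt

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
G = nx.Graph(edges)

# Basic network description: size, density, and node centrality.
print(G.number_of_nodes(), G.number_of_edges())
print(nx.density(G))
print(nx.degree_centrality(G))

# Simple visualization.
nx.draw(G, with_labels=True)
plt.show()
```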
This book provides a modern introductory tutorial on specialized theoretical aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter provides up-to-date coverage of particle association measures that underpin the theoretical properties of recently developed random set methods in space and time, otherwise known as the probability hypothesis density (PHD) filter framework. The second chapter gives an overview of recent advances in Monte Carlo methods for Bayesian filtering in high-dimensional spaces. In particular, the chapter explains how one may extend classical sequential Monte Carlo methods for filtering and static inference problems to high dimensions and big-data applications. The third chapter presents an overview of generalized families of processes that extend the class of Gaussian process models to heavy-tailed families known as alpha-stable processes. In particular, it covers aspects of their characterization via the spectral measure of heavy-tailed distributions and then provides an overview of their applications in wireless communications channel modeling. The final chapter concludes with an overview of analysis for probabilistic spatial percolation methods that are relevant in the modeling of graphical networks and connectivity applications in sensor networks, which also incorporate stochastic geometry features.
This comprehensive and stimulating introduction to Matlab, a computer language now widely used for technical computing, is based on an introductory course held at Qian Weichang College, Shanghai University, in the fall of 2014. Teaching and learning a substantial programming language aren't always straightforward tasks. Accordingly, this textbook is not meant to cover the whole range of this high-performance technical programming environment, but to motivate first- and second-year undergraduate students in mathematics and computer science to learn Matlab by studying representative problems, developing algorithms and programming them in Matlab. While several topics are taken from the field of scientific computing, the main emphasis is on programming. A wealth of examples are completely discussed and solved, allowing students to learn Matlab by doing: by solving problems, comparing approaches and assessing the proposed solutions.
SAS programming is a creative and iterative process designed to empower you to make the most of your organization's data. This friendly guide provides you with a repertoire of essential SAS tools for data management, whether you are a new or an infrequent user. Most useful to students and programmers with little or no SAS experience, it takes a no-frills, hands-on tutorial approach to getting started with the software. You will find immediate guidance in navigating, exploring, visualizing, cleaning, formatting, and reporting on data using SAS and JMP. Step-by-step demonstrations, screenshots, handy tips, and practical exercises with solutions equip you to explore, interpret, process and summarize data independently, efficiently and effectively.
This book presents multivariate time series methods for the analysis and optimal control of feedback systems. Although ships' autopilot systems are considered throughout the entire book, the methods set forth here can be applied to many other complicated, large, or noisy feedback control systems for which it is difficult to derive a model of the entire system from theory in that subject area. The basic models used in this method are the multivariate autoregressive model with exogenous variables (ARX model) and the radial basis function net-type coefficient ARX model. Noise contribution analysis can then be performed through the estimated autoregressive (AR) model, and various types of autopilot systems can be designed through the state-space representation of the models. The marine autopilot systems addressed in this book include optimal controllers for course-keeping motion, rolling reduction controllers with rudder motion, engine governor controllers, noise-adaptive autopilots, route-tracking controllers by direct steering, and the reference course-setting approach. The methods presented here are exemplified with real data analysis and experiments on real ships. This book is highly recommended to readers who are interested in designing optimal or adaptive controllers, not only for ships but also for any other complicated systems operating under noisy disturbance conditions.
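For orientation, the generic multivariate ARX form (written here in standard notation, not necessarily the book's exact formulation) relates the output vector $y_t$ (for a ship, e.g., heading and yaw rate) to its own past values and to past exogenous inputs $u_t$ (e.g., rudder angle):

$$ y_t=\sum_{i=1}^{p}A_i\,y_{t-i}+\sum_{j=1}^{q}B_j\,u_{t-j}+w_t, $$

where $A_i$ and $B_j$ are coefficient matrices and $w_t$ is a noise term; in the radial basis function net-type variant, the coefficients are allowed to vary rather than being held constant.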
This open access book contains review papers authored by thirteen plenary invited speakers to the 9th International Congress on Industrial and Applied Mathematics (Valencia, July 15-19, 2019). Written by top-level scientists recognized worldwide, the scientific contributions cover a wide range of cutting-edge topics of industrial and applied mathematics: mathematical modeling, industrial and environmental mathematics, mathematical biology and medicine, reduced-order modeling and cryptography. The book also includes an introductory chapter summarizing the main features of the congress. This is the first volume of a thematic series dedicated to research results presented at ICIAM 2019-Valencia Congress.
MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. MATLAB Matrix Algebra introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. Starting with a look at symbolic and numeric variables, with an emphasis on vector and matrix variables, you will go on to examine functions and operations that support vectors and matrices as arguments, including those based on analytic parent functions. Computational methods for finding eigenvalues and eigenvectors of matrices are detailed, leading to various matrix decompositions. Applications such as change of bases, the classification of quadratic forms and how to solve systems of linear equations are described, with numerous examples. A section is dedicated to sparse matrices and other types of special matrices. In addition to its treatment of matrices, you will also learn how MATLAB can be used to work with arrays, lists, tables, sequences and sets.
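The book's examples are in MATLAB; purely as an illustration of the operations it covers (solving linear systems, eigenvalues and eigenvectors, and matrix decompositions), here is a minimal Python/NumPy sketch of the same ideas, not code from the book.

```python
# Solve A x = b, compute eigenvalues/eigenvectors, and a QR decomposition.
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])
b = np.array([1.0, 2.0])

x = np.linalg.solve(A, b)    # solution of the linear system A x = b
w, V = np.linalg.eig(A)      # eigenvalues w and eigenvectors (columns of V)
Q, R = np.linalg.qr(A)       # QR decomposition of A

print(x, w, sep="\n")
```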
MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. MATLAB Linear Algebra introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. In addition to giving an introduction to the MATLAB environment and MATLAB programming, this book provides all the material needed to work in linear algebra with ease. In addition to exploring MATLAB's matrix algebra capabilities, it describes the MATLAB commands that are used to create two- and three-dimensional graphics, including explicit, implicit and parametric curve and surface plotting, and various methods of data representation. Methods for solving systems of equations are detailed.
It's much easier to grasp complex data relationships with a graph than by scanning numbers in a spreadsheet. This introductory guide shows you how to use the R language to create a variety of useful graphs for visualizing and analyzing complex data for science, business, media, and many other fields. You'll learn methods for highlighting important relationships and trends, reducing data to simpler forms, and emphasizing key numbers at a glance. Anyone who wants to analyze data will find something useful here, even if you don't have a background in mathematics, statistics, or computer programming. If you want to examine data related to your work, this book is the ideal way to start. You will learn how to:
* Get started with R by learning basic commands
* Build single-variable graphs, such as dot and pie charts, box plots, and histograms
* Explore the relationship between two quantitative variables with scatter plots, high-density plots, and other techniques
* Use scatterplot matrices, 3D plots, clustering, heat maps, and other graphs to visualize relationships among three or more variables
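The book builds all of its graphs in R; as a rough analogue of the single-variable and two-variable plots listed above, here is a minimal Python/matplotlib sketch on made-up data, not an example from the book.

```python
# Histogram of one variable and scatter plot of two variables (hypothetical data).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2 * x + rng.normal(scale=0.5, size=200)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.hist(x, bins=20)          # single-variable graph: histogram
ax1.set_title("Histogram of x")
ax2.scatter(x, y, s=10)       # two-variable graph: scatter plot
ax2.set_title("y vs x")
plt.tight_layout()
plt.show()
```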
This textbook offers an algorithmic introduction to the field of computer algebra. A leading expert in the field, the author guides readers through numerous hands-on tutorials designed to build practical skills and algorithmic thinking. This implementation-oriented approach equips readers with versatile tools that can be used to enhance studies in mathematical theory, applications, or teaching. Presented using Mathematica code, the book is fully supported by downloadable sessions in Mathematica, Maple, and Maxima. Opening with an introduction to computer algebra systems and the basics of programming mathematical algorithms, the book goes on to explore integer arithmetic. A chapter on modular arithmetic completes the number-theoretic foundations, which are then applied to coding theory and cryptography. From here, the focus shifts to polynomial arithmetic and algebraic numbers, with modern algorithms allowing the efficient factorization of polynomials. The final chapters offer extensions into more advanced topics: simplification and normal forms, power series, summation formulas, and integration. Computer Algebra is an indispensable resource for mathematics and computer science students new to the field. Numerous examples illustrate algorithms and their implementation throughout, with online support materials to encourage hands-on exploration. Prerequisites are minimal, with only a knowledge of calculus and linear algebra assumed. In addition to classroom use, the elementary approach and detailed index make this book an ideal reference for algorithms in computer algebra.
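As a small taste of the number-theoretic algorithms behind the modular arithmetic, coding theory and cryptography chapters, here is a minimal Python sketch (the book itself presents its code in Mathematica) of the extended Euclidean algorithm and a modular inverse.

```python
# Extended Euclidean algorithm and modular inverse (standard textbook algorithms).
def extended_gcd(a: int, b: int) -> tuple[int, int, int]:
    """Return (g, x, y) with g = gcd(a, b) and a*x + b*y = g."""
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y

def mod_inverse(a: int, m: int) -> int:
    """Return a^{-1} mod m, assuming gcd(a, m) == 1."""
    g, x, _ = extended_gcd(a, m)
    if g != 1:
        raise ValueError("inverse does not exist")
    return x % m

print(mod_inverse(7, 26))  # 15, since 7 * 15 = 105 = 4*26 + 1
```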
MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. MATLAB Control Systems Engineering introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. In addition to giving an introduction to the MATLAB environment and MATLAB programming, this book provides all the material needed to design and analyze control systems using MATLAB's specialized Control Systems Toolbox. The Control Systems Toolbox offers an extensive range of tools for classical and modern control design. Using these tools you can create models of linear time-invariant systems in transfer function, zero-pole-gain or state space format. You can manipulate both discrete-time and continuous-time systems and convert between various representations. You can calculate and graph time response, frequency response and loci of roots. Other functions allow you to perform pole placement, optimal control and estimation. The Control Systems Toolbox is open and extensible, allowing you to create customized M-files to suit your specific applications.
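The workflow described above uses MATLAB's Control Systems Toolbox; as an illustrative analogue only, the following Python sketch uses scipy.signal to build a transfer-function model of a hypothetical second-order plant, convert it to state space, and compute its step response.

```python
# Transfer-function model, state-space conversion, and step response (illustrative).
from scipy import signal
import matplotlib.pyplot as plt

# Transfer function G(s) = 1 / (s^2 + 2 s + 1)
sys_tf = signal.TransferFunction([1.0], [1.0, 2.0, 1.0])

# Convert to a state-space representation.
sys_ss = sys_tf.to_ss()

# Time response to a unit step input.
t, y = signal.step(sys_tf)
plt.plot(t, y)
plt.xlabel("time [s]")
plt.ylabel("output")
plt.show()
```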
The book presents a comprehensive vision of the impact of ICT on the contemporary city, heritage, public spaces and meta-cities at both the urban and metropolitan scales, not only producing innovative perspectives but also relating them to newly discovered scientific methods that can be used to stimulate the emerging reciprocal relations between cities and information technologies. Using the principles established by multi-disciplinary interventions as examples and then expanding on them, this book demonstrates how, by using ICT and new devices, metropolises can be organized for a future that preserves the historic nucleus of the city and the environment while preparing for the necessary expansion of transportation, housing and industrial facilities.
MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. MATLAB Differential Equations introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. In addition to giving an introduction to the MATLAB environment and MATLAB programming, this book provides all the material needed to work on differential equations using MATLAB. It includes techniques for solving ordinary and partial differential equations of various kinds, and systems of such equations, either symbolically or using numerical methods (Euler's method, Heun's method, the Taylor series method, the Runge-Kutta method, and others). It also describes how to implement mathematical tools such as the Laplace transform, orthogonal polynomials, and special functions (Airy and Bessel functions), and find solutions of finite difference equations.
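The book implements these schemes in MATLAB; as a generic illustration of the simplest of them, here is a minimal Python sketch of Euler's method applied to the test equation dy/dt = -2y, y(0) = 1 (an example chosen here for illustration, not taken from the book).

```python
# Fixed-step Euler integration of y' = f(t, y); exact solution here is exp(-2 t).
import numpy as np

def euler(f, y0, t0, t1, n):
    """Integrate y' = f(t, y) on [t0, t1] with n Euler steps, returning y(t1)."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = y + h * f(t, y)
        t = t + h
    return y

approx = euler(lambda t, y: -2.0 * y, 1.0, 0.0, 1.0, 1000)
print(approx, np.exp(-2.0))  # the two values should be close
```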
Written in a clear and lively tone, Statistics Using IBM SPSS provides a data-centric approach to statistics with integrated SPSS (version 22) commands, ensuring that students gain both a deep conceptual understanding of statistics and practical facility with the leading statistical software package. With one hundred worked examples, the textbook guides students through statistical practice using real data and avoids complicated mathematics. Numerous end-of-chapter exercises allow students to apply and test their understanding of chapter topics, with detailed answers available online. The third edition has been updated throughout and includes a new chapter on research design, new topics (including weighted mean, resampling with the bootstrap, the role of the syntax file in workflow management, and regression to the mean) and new examples and exercises. Student learning is supported by a rich suite of online resources, including answers to end-of-chapter exercises, real data sets, PowerPoint slides, and a test bank.
The matrix laboratory interactive computing environment MATLAB has brought creativity to research in diverse disciplines, particularly in designing and programming experiments. More commonly used in mathematics and the sciences, it also lends itself to a variety of applications across the field of psychology. For the novice looking to use it in experimental psychology research, though, becoming familiar with MATLAB can be a daunting task. MATLAB for Psychologists expertly guides readers through the component steps, skills, and operations of the software, with plentiful graphics and examples to match the reader's comfort level. Using an extended illustration, this concise volume explains the program's usefulness at any point in an experiment, without the limits imposed by other types of software. And the authors demonstrate the responsiveness of MATLAB to the individual's research needs, whether the task is programming experiments, creating sensory stimuli, running simulations, or calculating statistics for data analysis. Key features of the coverage:
* Thinking in a matrix way
* Handling and plotting data
* Guidelines for improved programming, sound, and imaging
* Statistical analysis and signal detection theory indexes
* The Graphical User Interface
* The Psychophysics Toolbox
MATLAB for Psychologists serves a wide audience of advanced undergraduate and graduate level psychology students, professors, and researchers as well as lab technicians involved in programming psychology experiments.
Introduction to Global Optimization Exploiting Space-Filling Curves provides an overview of classical and new results pertaining to the use of space-filling curves in global optimization. The authors look at a family of derivative-free numerical algorithms that apply space-filling curves to reduce the dimensionality of the global optimization problem, along with a number of unconventional ideas, such as adaptive strategies for estimating the Lipschitz constant and balancing global and local information to accelerate the search. Convergence conditions of the described algorithms are studied in depth and theoretical considerations are illustrated through numerical examples. This work also contains code for implementing space-filling curves that can be used for constructing new global optimization algorithms. Basic ideas from this text can be applied to a number of problems, including problems with multiextremal and partially defined constraints, and non-redundant parallel computations can be organized. Professors, students, researchers, engineers, and other professionals in the fields of pure mathematics, nonlinear sciences studying fractals, operations research, management science, industrial and applied mathematics, computer science, engineering, economics, and the environmental sciences will find this title useful.
The first part of this title contained all statistical tests that are relevant for starters on SPSS, including standard parametric and non-parametric tests for continuous and binary variables, regression methods, trend tests, and reliability and validity assessments of diagnostic tests. The current Part 2 of this title reviews multistep methods, multivariate models, assessments of missing data, performance of diagnostic tests, meta-regression, Poisson regression, confounding and interaction, and survival analyses using log tests and segmented time-dependent Cox regression. Methods for assessing nonlinear models, data seasonality, and distribution-free methods, including Monte Carlo methods and artificial intelligence, as well as robust tests, are also covered. Each method of testing is explained using a data example from clinical practice, including every step in SPSS, together with a text with interpretations of the results and hints convenient for data reporting. In order to facilitate the use of this cookbook, the data files of the examples are made available by the editor through extras.springer.com. Both Part 1 and Part 2 of this title contain a minimal amount of text and maximal technical detail, but we believe that this will not keep students from mastering the SPSS software systematics and that, instead, it will help toward that aim. We nevertheless recommend that it be used together with the textbook "Statistics Applied to Clinical Trials" (5th edition, Springer, Dordrecht, 2012) and the e-books "Statistics on a Pocket Calculator Part 1 and 2" (Springer, Dordrecht, 2011 and 2012) from the same authors.