This small book addresses different kinds of datafiles, as commonly encountered in clinical research, and their data-analysis on SPSS software. Some 15 years ago serious statistical analyses were conducted by specialist statisticians using mainframe computers. Nowadays, there is ready access to statistical computing using personal computers or laptops, and this practice has changed the boundaries between basic statistical methods that can be conveniently carried out on a pocket calculator and more advanced statistical methods that can only be executed on a computer. Clinical researchers currently perform basic statistics without professional help from a statistician, including t-tests and chi-square tests. With the help of user-friendly software the step from such basic tests to more complex tests has become smaller and easier to take. It is our experience as masters' and doctorate class teachers of the European College of Pharmaceutical Medicine (EC Socrates Project, Lyon, France) that students are eager to master adequate command of statistical software for that purpose. However, doing so, albeit easy, still takes 20-50 steps from logging in to the final result, and all of these steps have to be learned in order for the procedures to be successful.
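As a rough illustration of the kind of basic tests named above (not an example from the book, which works in SPSS), the following sketch uses Python's scipy.stats with invented sample data:

```python
# Minimal sketch of a t-test and a chi-square test with scipy.stats;
# the data values are invented for illustration only.
from scipy import stats

# Independent-samples t-test on two invented treatment groups
group_a = [5.1, 4.9, 6.2, 5.8, 5.5]
group_b = [4.2, 4.8, 4.5, 5.0, 4.1]
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Chi-square test of independence on an invented 2x2 contingency table
table = [[30, 10], [20, 25]]
chi2, chi_p, dof, expected = stats.chi2_contingency(table)

print(f"t-test: t={t_stat:.2f}, p={t_p:.3f}")
print(f"chi-square: chi2={chi2:.2f}, p={chi_p:.3f}")
```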
This is an introduction to time series that emphasizes methods and analysis of data sets. The logic and tools of model-building for stationary and non-stationary time series are developed and numerous exercises, many of which make use of the included computer package, provide the reader with ample opportunity to develop skills. Statisticians and students will learn the latest methods in time series and forecasting, along with modern computational models and algorithms.
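As a hedged sketch of the model-building workflow described above, the snippet below fits a simple ARIMA model to a simulated non-stationary series with Python's statsmodels; the order (1, 1, 1) and the data are illustrative assumptions, not taken from the book or its included package.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Simulate a non-stationary random-walk-plus-noise series
y = np.cumsum(rng.normal(size=200)) + rng.normal(scale=0.5, size=200)

model = ARIMA(y, order=(1, 1, 1))   # AR(1), one difference, MA(1)
fit = model.fit()
forecast = fit.forecast(steps=10)   # ten-step-ahead forecast
print(fit.summary())
print(forecast)
```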
Post-Optimal Analysis in Linear Semi-Infinite Optimization examines the following topics in regards to linear semi-infinite optimization: modeling uncertainty, qualitative stability analysis, quantitative stability analysis and sensitivity analysis. Linear semi-infinite optimization (LSIO) deals with linear optimization problems where the dimension of the decision space or the number of constraints is infinite. The authors compare the post-optimal analysis with alternative approaches to uncertain LSIO problems and provide readers with criteria to choose the best way to model a given uncertain LSIO problem depending on the nature and quality of the data along with the available software. This work also contains open problems which readers will find intriguing a challenging. Post-Optimal Analysis in Linear Semi-Infinite Optimization is aimed toward researchers, graduate and post-graduate students of mathematics interested in optimization, parametric optimization and related topics.
Sampling consists of selection, acquisition, and quantification of a part of the population. While selection and acquisition apply to physical sampling units of the population, quantification pertains only to the variable of interest, which is a particular characteristic of the sampling units. A sampling procedure is expected to provide a sample that is representative with respect to some specified criteria. Composite sampling, under idealized conditions, incurs no loss of information for estimating the population mean. But an important limitation of the method has been the loss of information on individual sample values, such as an extremely large value. In many of the situations where individual sample values are of interest or concern, composite sampling methods can be suitably modified to retrieve the information on individual sample values that may be lost due to compositing. This book presents statistical solutions to issues that arise in the context of applications of composite sampling.
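A minimal sketch (with invented numbers, not from the book) of the point made above: measuring composites instead of individual samples still yields the same estimate of the population mean, but individual values such as the extreme observation are no longer visible after compositing.

```python
import numpy as np

rng = np.random.default_rng(1)
individual = rng.normal(loc=10.0, scale=2.0, size=100)   # true sample values

# Form composites of size 5: each composite measurement is the mean of its parts
composites = individual.reshape(20, 5).mean(axis=1)

print(individual.mean())   # estimate from individual measurements
print(composites.mean())   # identical estimate from composite measurements
print(individual.max())    # the extreme value, visible only before compositing
```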
This book presents methods for computing correlation equations. All the topics treated here are elucidated in terms of concrete examples, which have been chosen, for the most part, from the field of analysis of the mechanical properties of steel, wood, and other materials. A necessary prerequisite for any study of correlation equations is some knowledge of the moments of random variables. In the Appendix, there is provided a brief treatment of moments, as well as a discussion of the simplest methods of computing them. We have paid particular attention in this book to the techniques of computing correlation equations, and to the use of tables for alleviating the computational load. The mathematical bases of the methods used in setting up correlation equations are expounded in the books cited at the end of this volume. A. M., December 1965. Please note that the abbreviation lg is used in this book to designate the logarithm to base ten. Note further that the comma has been retained as the decimal point in tabular material.
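A short sketch, not taken from the book, of what computing a simple correlation equation amounts to: the least-squares line and the correlation coefficient obtained from the moments of invented density/strength data.

```python
import numpy as np

x = np.array([0.40, 0.45, 0.50, 0.55, 0.60])   # e.g. wood density (invented)
y = np.array([32.0, 36.5, 41.0, 44.0, 49.5])   # e.g. strength (invented)

r = np.corrcoef(x, y)[0, 1]          # correlation coefficient
b = r * y.std() / x.std()            # slope from the moments
a = y.mean() - b * x.mean()          # intercept
print(f"y = {a:.2f} + {b:.2f} x,  r = {r:.3f}")
```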
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions are infeasible. Evolutionary algorithms represent a powerful and easily understood means of approximating the optimum value in a variety of settings. The proposed text seeks to guide readers through the crucial issues of optimization problems in statistical settings and the implementation of tailored methods (including both stand-alone evolutionary algorithms and hybrid crosses of these procedures with standard statistical algorithms like Metropolis-Hastings) in a variety of applications. This book would serve as an excellent reference work for statistical researchers at an advanced graduate level or beyond, particularly those with a strong background in computer science.
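To make the idea concrete, here is a hedged, stand-alone sketch of the kind of evolutionary algorithm discussed above: a simple mutation-and-selection loop minimising an invented multimodal objective. The population size, mutation scale, and objective are illustrative assumptions, not the book's own procedures.

```python
import numpy as np

def objective(x):
    # Rastrigin-style multimodal test function, awkward for analytic optimisation
    return np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10, axis=-1)

rng = np.random.default_rng(42)
pop = rng.uniform(-5, 5, size=(30, 2))                        # initial population
for generation in range(200):
    children = pop + rng.normal(scale=0.3, size=pop.shape)    # mutation
    combined = np.vstack([pop, children])
    fitness = objective(combined)
    pop = combined[np.argsort(fitness)[:30]]                  # keep the 30 best

best = pop[0]
print(best, objective(best))
```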
The first MATLAB-based numerical methods textbook for bioengineers that uniquely integrates modelling concepts with statistical analysis, while maintaining a focus on enabling the user to report the error or uncertainty in their result. Between traditional numerical method topics of linear modelling concepts, nonlinear root finding, and numerical integration, chapters on hypothesis testing, data regression and probability are interwoven. A unique feature of the book is the inclusion of examples from clinical trials and bioinformatics, which are not found in other numerical methods textbooks for engineers. With a wealth of biomedical engineering examples, case studies on topical biomedical research, and the inclusion of end of chapter problems, this is a perfect core text for a one-semester undergraduate course.
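A brief sketch, in Python rather than the book's MATLAB, of two of the numerical-method topics listed above: nonlinear root finding and numerical integration. The one-compartment drug-elimination model and its constants are invented for illustration.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.integrate import quad

def concentration(t):
    # Invented one-compartment model: dose absorbed and then eliminated
    return 10.0 * (np.exp(-0.1 * t) - np.exp(-1.0 * t))

# Root finding: when does the concentration fall back to 1.0?
t_low = brentq(lambda t: concentration(t) - 1.0, 5.0, 60.0)

# Numerical integration: area under the curve from 0 to 48 hours
auc, auc_err = quad(concentration, 0.0, 48.0)

print(f"concentration drops to 1.0 at t = {t_low:.1f} h, AUC = {auc:.1f}")
```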
Intended for both researchers and practitioners, this book will be a valuable resource for studying and applying recent robust statistical methods. It contains up-to-date research results in the theory of robust statistics, treats computational aspects and algorithms, and shows interesting and new applications.
Looking back at the years that have passed since the realization of the very first electronic, multi-purpose computers, one observes a tremendous growth in hardware and software performance. Today, researchers and engineers have access to computing power and software that can solve numerical problems which are not fully understood in terms of existing mathematical theory. Thus, computational sciences must in many respects be viewed as experimental disciplines. As a consequence, there is a demand for high-quality, flexible software that allows, and even encourages, experimentation with alternative numerical strategies and mathematical models. Extensibility is then a key issue; the software must provide an efficient environment for incorporation of new methods and models that will be required in future problem scenarios. The development of such flexible software is a challenging and expensive task. One way to achieve these goals is to invest much work in the design and implementation of generic software tools which can be used in a wide range of application fields. In order to provide a forum where researchers could present and discuss their contributions to the described development, an International Workshop on Modern Software Tools for Scientific Computing was arranged in Oslo, Norway, September 16-18, 1996. This workshop, informally referred to as SciTools '96, was a collaboration between SINTEF Applied Mathematics and the Departments of Informatics and Mathematics at the University of Oslo.
Separation of signal from noise is the most fundamental problem in data analysis, arising in such fields as: signal processing, econometrics, actuarial science, and geostatistics. This book introduces the local regression method in univariate and multivariate settings, with extensions to local likelihood and density estimation. Practical information is also included on how to implement these methods in the programs S-PLUS and LOCFIT.
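As a minimal sketch of local regression in Python (the book itself works with S-PLUS and LOCFIT), the statsmodels lowess smoother below stands in for the book's software; the noisy signal is simulated for illustration.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)
x = np.linspace(0, 4 * np.pi, 200)
y = np.sin(x) + rng.normal(scale=0.4, size=x.size)   # signal plus noise

smoothed = lowess(y, x, frac=0.15)   # local linear fits over 15% windows
# smoothed[:, 0] holds the sorted x values, smoothed[:, 1] the fitted curve
print(smoothed[:5])
```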
The advent of fast and sophisticated computer graphics has brought dynamic and interactive images under the control of professional mathematicians and mathematics teachers. This volume in the NATO Special Programme on Advanced Educational Technology takes a comprehensive and critical look at how the computer can support the use of visual images in mathematical problem solving. The contributions are written by researchers and teachers from a variety of disciplines including computer science, mathematics, mathematics education, psychology, and design. Some focus on the use of external visual images and others on the development of individual mental imagery. The book is the first collected volume in a research area that is developing rapidly, and the authors pose some challenging new questions.
The World Wide Web is becoming a utility, not unlike electricity or running water in our homes. This creates new ways of using the web, in which social media plays a particular role. This gives an unprecedented opportunity to study emerging social phenomena in the virtual world. In addition, it opens new avenues for improving public services such as schooling and education. This book includes some of the latest developments in employing information and communications technologies for examining both virtual and real-life social interactions. Investigating modern challenges such as online education, web security or organized cybercrime, this book outlines the state of the art in social applications and implications of ICT.
Dealing with methods for sampling from posterior distributions and how to compute posterior quantities of interest using Markov chain Monte Carlo (MCMC) samples, this book addresses such topics as improving simulation accuracy, marginal posterior density estimation, estimation of normalizing constants, constrained parameter problems, highest posterior density interval calculations, computation of posterior modes, and posterior computations for proportional hazards models and Dirichlet process models. The authors also discuss model comparisons, including both nested and non-nested models, marginal likelihood methods, ratios of normalizing constants, Bayes factors, the Savage-Dickey density ratio, Stochastic Search Variable Selection, Bayesian Model Averaging, the reversible jump algorithm, and model adequacy using predictive and latent residual approaches. The book presents an equal mixture of theory and applications involving real data, and is intended as a graduate textbook or a reference book for a one-semester course at the advanced masters or Ph.D. level. It will also serve as a useful reference for applied or theoretical researchers as well as practitioners.
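For orientation, here is a compact random-walk Metropolis-Hastings sketch for drawing MCMC samples from a posterior; the standard-normal "posterior", proposal scale, and burn-in length are placeholders, not an example taken from the book.

```python
import numpy as np

def log_posterior(theta):
    return -0.5 * theta**2          # stand-in: standard normal up to a constant

rng = np.random.default_rng(0)
samples = np.empty(10_000)
theta = 0.0
for i in range(samples.size):
    proposal = theta + rng.normal(scale=1.0)           # random-walk proposal
    log_alpha = log_posterior(proposal) - log_posterior(theta)
    if np.log(rng.uniform()) < log_alpha:               # accept/reject step
        theta = proposal
    samples[i] = theta

print(samples[2000:].mean(), samples[2000:].std())      # discard burn-in
```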
The intensive use of automatic data acquisition systems and of cloud computing for process monitoring has led to an increased occurrence of industrial processes that utilize statistical process control and capability analysis. These analyses are performed almost exclusively with multivariate methodologies. The aim of this Brief is to present the most important MSQC techniques developed in the R language. The book is divided into two parts. The first part contains the basic R elements, an introduction to statistical procedures, and the main aspects related to Statistical Quality Control (SQC). The second part covers the construction of multivariate control charts and the calculation of multivariate capability indices.
Every advance in computer architecture and software tempts statisticians to tackle numerically harder problems. To do so intelligently requires a good working knowledge of numerical analysis. This book equips students to craft their own software and to understand the advantages and disadvantages of different numerical methods. Issues of numerical stability, accurate approximation, computational complexity, and mathematical modeling share the limelight in a broad yet rigorous overview of those parts of numerical analysis most relevant to statisticians. In this second edition, the material on optimization has been completely rewritten. There is now an entire chapter on the MM algorithm in addition to more comprehensive treatments of constrained optimization, penalty and barrier methods, and model selection via the lasso. There is also new material on the Cholesky decomposition, Gram-Schmidt orthogonalization, the QR decomposition, the singular value decomposition, and reproducing kernel Hilbert spaces. The discussions of the bootstrap, permutation testing, independent Monte Carlo, and hidden Markov chains are updated, and a new chapter on advanced MCMC topics introduces students to Markov random fields, reversible jump MCMC, and convergence analysis in Gibbs sampling. Numerical Analysis for Statisticians can serve as a graduate text for a course surveying computational statistics. With a careful selection of topics and appropriate supplementation, it can be used at the undergraduate level. It contains enough material for a graduate course on optimization theory. Because many chapters are nearly self-contained, professional statisticians will also find the book useful as a reference.
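A small sketch of the nonparametric bootstrap mentioned above: resampling an invented data set with replacement to approximate the standard error and a percentile interval for the mean. All numbers are illustrative, not drawn from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=50)    # invented skewed sample

boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(5000)
])

print(data.mean(), boot_means.std())           # point estimate and bootstrap SE
print(np.percentile(boot_means, [2.5, 97.5]))  # percentile confidence interval
```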
Biological and biomedical studies have entered a new era over the past two decades thanks to the wide use of mathematical models and computational approaches. A boom in computational biology, which was a mere theoretician's fantasy twenty years ago, has become a reality. Obsession with computational biology and theoretical approaches is evidenced in articles hailing the arrival of what are variously called quantitative biology, bioinformatics, theoretical biology, and systems biology. New technologies and data resources in genetics, such as the International HapMap project, enable large-scale studies, such as genome-wide association studies, which could potentially identify most common genetic variants as well as rare variants of the human DNA that may alter an individual's susceptibility to disease and the response to medical treatment. Meanwhile, multi-electrode recording from behaving animals makes it feasible to control the animal's mental activity, which could potentially lead to the development of useful brain-machine interfaces. Embracing the sheer volume of genetic, genomic, and other types of data, an essential approach is, first of all, to avoid drowning the true signal in the data. It has been witnessed that the theoretical approach to biology has emerged as a powerful and stimulating research paradigm in biological studies, which in turn leads to a new research paradigm in mathematics, physics, and computer science and moves forward with the interplays among experimental studies and outcomes, simulation studies, and theoretical investigations.
This book covers accurate and efficient computer algorithms for factoring matrices, solving linear systems of equations, and extracting eigenvalues and eigenvectors. Regardless of the software system used, the book describes and gives examples of the use of modern computer software for numerical linear algebra. It begins with a discussion of the basics of numerical computations, and then describes the relevant properties of matrix inverses, factorisations, matrix and vector norms, and other topics in linear algebra. The book is essentially self-contained, with the topics addressed constituting the essential material for an introductory course in statistical computing. Numerous exercises allow the text to be used for a first course in statistical computing or as a supplementary text for various courses that emphasise computations.
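A short illustration, in Python rather than the book's own examples, of the core tasks named above: factoring a matrix, solving a linear system, and extracting eigenvalues and eigenvectors. The matrix is an invented symmetric positive definite example.

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

L = np.linalg.cholesky(A)            # factorisation A = L @ L.T
x = np.linalg.solve(A, b)            # solution of A x = b
w, V = np.linalg.eigh(A)             # eigenvalues and eigenvectors (symmetric A)

print(np.allclose(L @ L.T, A), np.allclose(A @ x, b))
print(w)
```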
A unique textbook for an undergraduate course on mathematical modeling, Differential Equations with MATLAB: Exploration, Applications, and Theory provides students with an understanding of the practical and theoretical aspects of mathematical models involving ordinary and partial differential equations (ODEs and PDEs). The text presents a unifying picture inherent to the study and analysis of more than 20 distinct models spanning disciplines such as physics, engineering, and finance. The first part of the book presents systems of linear ODEs. The text develops mathematical models from ten disparate fields, including pharmacokinetics, chemistry, classical mechanics, neural networks, physiology, and electrical circuits. Focusing on linear PDEs, the second part covers PDEs that arise in the mathematical modeling of phenomena in ten other areas, including heat conduction, wave propagation, fluid flow through fissured rocks, pattern formation, and financial mathematics. The authors engage students by posing questions of all types throughout, including verifying details, proving conjectures of actual results, analyzing broad strokes that occur within the development of the theory, and applying the theory to specific models. The authors' accessible style encourages students to actively work through the material and answer these questions. In addition, the extensive use of MATLAB (R) GUIs allows students to discover patterns and make conjectures.
Many of the commonly used methods for modeling and fitting psychophysical data are special cases of statistical procedures of great power and generality, notably the Generalized Linear Model (GLM). This book illustrates how to fit data from a variety of psychophysical paradigms using modern statistical methods and the statistical language R. The paradigms include signal detection theory, psychometric function fitting, classification images and more. In two chapters, recently developed methods for scaling appearance, maximum likelihood difference scaling and maximum likelihood conjoint measurement, are examined. The authors also consider the application of mixed-effects models to psychophysical data. R is an open-source programming language that is widely used by statisticians and is seeing enormous growth in its application to data in all fields. It is interactive, containing many powerful facilities for optimization, model evaluation, model selection, and graphical display of data. The reader who fits data in R can readily make use of these methods. The researcher who uses R to fit and model his data has access to the most recently developed statistical methods. This book does not assume that the reader is familiar with R, and a little experience with any programming language is all that is needed to appreciate this book. There are large numbers of examples of R in the text, and the source code for all examples is available in the R package MPDiR, available through R. Laurence T. Maloney is Professor of Psychology and Neural Science at New York University. His research focusses on applications of mathematical models to perception, motor control and decision making.
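A hedged sketch, in Python rather than the book's R, of psychometric function fitting as a binomial GLM with a probit link; the stimulus intensities and response counts are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

intensity = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
n_trials = np.full(6, 40)
n_correct = np.array([22, 25, 30, 34, 38, 39])          # invented "yes" counts

X = sm.add_constant(intensity)
response = np.column_stack([n_correct, n_trials - n_correct])
model = sm.GLM(response, X,
               family=sm.families.Binomial(sm.families.links.Probit()))
fit = model.fit()
print(fit.params)                                        # intercept and slope on the probit scale
```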
Bayesian Networks in R with Applications in Systems Biology is unique in that it introduces the reader to the essential concepts in Bayesian network modeling and inference in conjunction with examples in the open-source statistical environment R. The level of sophistication is gradually increased across the chapters, with exercises and solutions for enhanced understanding and hands-on experimentation with the theory and concepts. The application focus is systems biology, with emphasis on modeling pathways and signaling mechanisms from high-throughput molecular data. Bayesian networks have proven to be especially useful abstractions in this regard. Their usefulness is especially exemplified by their ability to discover new associations in addition to validating known ones across the molecules of interest. It is also expected that the prevalence of publicly available high-throughput biological data sets may encourage the audience to investigate novel paradigms using the approaches presented in the book.
Recent advances in the understanding of star formation and evolution have been impressive, and aspects of that knowledge are explored in this volume. The black hole stellar endpoints are studied and geodesic motion is explored. The emission of gravitational waves is featured due to their very recent experimental discovery. The second aspect of the text is space exploration, which began 62 years ago with the Sputnik Earth satellite, followed by the landing on the Moon just 50 years ago. Since then Mars has been explored remotely, along with flybys of the outer planets and probes which have escaped the solar system. The text explores many aspects of rocket travel. Finally, possibilities for interstellar travel are discussed. All these topics are treated in a unified way using the MATLAB App to combine text, figures, formulae and numeric input and output. In this way the reader may vary parameters and see the results in real time. That experience aids in building up an intuitive feel for the many specific problems given in this text.
This pocket guide explains the content and the practical use of ISO 21500 - Guidance on project management, the latest international standard for project management, and the first of a family of ISO standards for project, portfolio and program management. ISO 21500 is meant for senior managers and project sponsors to better understand project management and to properly support projects, and for project managers and their team members to have a reference for comparing their projects to others; it can also be used as a basis for the development of national standards. This pocket guide provides a quick introduction as well as a structured overview of this guidance and deals with the key issues within project management: roles and responsibilities, balancing the project constraints, and competencies of project personnel. All ISO 21500 subject groups (themes) are explained: Integration, Stakeholder, Scope, Resource, Time, Cost, Risk, Quality, Procurement and Communication. A separate chapter explains the comparison between ISO 21500 and the PMBOK(R) Guide, PRINCE2, Agile, Lean, Six Sigma and other methods, practices and models. Finally, it provides a high-level description of how ISO 21500 can be applied in practice using a generic project life cycle. Proper application of this new globally accepted project management guideline will support organizations and individuals in growing their project management maturity consistently to a professional level.
This third edition of Braun and Murdoch's bestselling textbook now includes discussion of the use and design principles of the tidyverse packages in R, including expanded coverage of ggplot2, and R Markdown. The expanded simulation chapter introduces the Box-Muller and Metropolis-Hastings algorithms. New examples and exercises have been added throughout. This is the only introduction you'll need to start programming in R, the computing standard for analyzing data. This book comes with real R code that teaches the standards of the language. Unlike other introductory books on the R system, this book emphasizes portable programming skills that apply to most computing languages and techniques used to develop more complex projects. Solutions, datasets, and any errata are available from www.statprogr.science. Worked examples from real applications, hundreds of exercises, and downloadable code, datasets, and solutions make a complete package for anyone working in or learning practical data science.
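A tiny sketch of the Box-Muller transform mentioned above: turning two uniform random numbers into two independent standard-normal draws (written here in Python rather than the book's R).

```python
import numpy as np

rng = np.random.default_rng(0)
u1, u2 = rng.uniform(size=100_000), rng.uniform(size=100_000)

r = np.sqrt(-2.0 * np.log(u1))
z1 = r * np.cos(2.0 * np.pi * u2)
z2 = r * np.sin(2.0 * np.pi * u2)

print(z1.mean(), z1.std(), z2.mean(), z2.std())   # should be close to 0 and 1
```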
This book was first published in 2003. Combinatorica, an extension to the popular computer algebra system Mathematica (R), is the most comprehensive software available for teaching and research applications of discrete mathematics, particularly combinatorics and graph theory. This book is the definitive reference/user's guide to Combinatorica, with examples of all 450 Combinatorica functions in action, along with the associated mathematical and algorithmic theory. The authors cover classical and advanced topics on the most important combinatorial objects: permutations, subsets, partitions, and Young tableaux, as well as all important areas of graph theory: graph construction operations, invariants, embeddings, and algorithmic graph theory. In addition to being a research tool, Combinatorica makes discrete mathematics accessible in new and exciting ways to a wide variety of people, by encouraging computational experimentation and visualization. The book contains no formal proofs, but enough discussion to understand and appreciate all the algorithms and theorems it contains.
"MATLAB for Neuroscientists" serves as the only complete study manual and teaching resource for MATLAB, the globally accepted standard for scientific computing, in the neurosciences and psychology. This unique introduction can be used to learn the entire empirical and experimental process (including stimulus generation, experimental control, data collection, data analysis, modeling, and more), and the 2nd Edition continues to ensure that a wide variety of computational problems can be addressed in a single programming environment. This updated edition features additional material on the
creation of visual stimuli, advanced psychophysics, analysis of LFP
data, choice probabilities, synchrony, and advanced spectral
analysis. Users at a variety of levels-advanced undergraduates,
beginning graduate students, and researchers looking to modernize
their skills-will learn to design and implement their own
analytical tools, and gain the fluency required to meet the
computational needs of neuroscience practitioners. |