This is an introduction to time series that emphasizes methods and the analysis of data sets. The logic and tools of model-building for stationary and non-stationary time series are developed, and numerous exercises, many of which make use of the included computer package, give the reader ample opportunity to develop skills. Statisticians and students will learn the latest methods in time series and forecasting, along with modern computational models and algorithms.
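As a rough illustration of the kind of model fitting the book teaches (this sketch is not from the book and uses Python's statsmodels rather than the text's included package; the simulated AR(1) series and its coefficient are made up):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulate an AR(1) series x_t = 0.7 x_{t-1} + e_t (purely illustrative data).
rng = np.random.default_rng(0)
e = rng.normal(size=200)
x = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.7 * x[t - 1] + e[t]

# Fit a stationary ARIMA(1, 0, 0) model and forecast ten steps ahead.
fit = ARIMA(x, order=(1, 0, 0)).fit()
print(fit.params)             # estimated constant, AR coefficient, and noise variance
print(fit.forecast(steps=10))
```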
A unique textbook for an undergraduate course on mathematical modeling, Differential Equations with MATLAB: Exploration, Applications, and Theory provides students with an understanding of the practical and theoretical aspects of mathematical models involving ordinary and partial differential equations (ODEs and PDEs). The text presents a unifying picture inherent to the study and analysis of more than 20 distinct models spanning disciplines such as physics, engineering, and finance. The first part of the book presents systems of linear ODEs. The text develops mathematical models from ten disparate fields, including pharmacokinetics, chemistry, classical mechanics, neural networks, physiology, and electrical circuits. Focusing on linear PDEs, the second part covers PDEs that arise in the mathematical modeling of phenomena in ten other areas, including heat conduction, wave propagation, fluid flow through fissured rocks, pattern formation, and financial mathematics. The authors engage students by posing questions of all types throughout, including verifying details, proving conjectures of actual results, analyzing broad strokes that occur within the development of the theory, and applying the theory to specific models. The authors' accessible style encourages students to actively work through the material and answer these questions. In addition, the extensive use of MATLAB (R) GUIs allows students to discover patterns and make conjectures.
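For readers who want a feel for the kind of ODE model the first part analyzes, here is a minimal sketch assuming a hypothetical two-compartment pharmacokinetic system y' = Ay; it uses Python's SciPy rather than the MATLAB GUIs the book is built around, and the rate constants are invented:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical two-compartment model: material moves between compartments at fixed rates.
A = np.array([[-0.5,  0.1],
              [ 0.5, -0.1]])

def rhs(t, y):
    return A @ y

# Start with the full dose in compartment 1 and integrate over 20 time units.
sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0])
print(sol.y[:, -1])   # amounts in each compartment at the final time
```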
Pro SharePoint 2013 Administration is a practical guide to SharePoint 2013 for intermediate to advanced SharePoint administrators and power users, covering the out-of-the-box feature set and capabilities of Microsoft's collaboration and business productivity platform. SharePoint 2013 is an incredibly complex product, with many moving parts, new features, best practices, and "gotchas." Author Rob Garrett distills SharePoint's portfolio of features, capabilities, and utilities into an in-depth professional guide, with no fluff and copious advice, that is designed from scratch to be the manual Microsoft never wrote. Starting with a detailed deployment and initial configuration walkthrough, the book covers all major feature areas, including document management, social computing, metadata management, and administration. You'll also learn about planning for capacity, backup administration and disaster recovery, business intelligence, monitoring, and more. Unlike other books, Pro SharePoint 2013 Administration covers all elements of the product, but has a specific emphasis on features new and enhanced in the 2013 release. Pro SharePoint 2013 Administration is the only book you'll need as you set out to deploy and administer SharePoint 2013.
Microsoft Windows 8.1 and Windows Server 2012 R2 are designed to be the best performing operating systems to date, but even the best systems can be overwhelmed with load and/or plagued with poorly performing code. Windows Performance Analysis Field Guide gives you a practical field guide approach to performance monitoring and analysis from experts who do this work every day. Think of this book as your own guide to "What would Microsoft support do?" when you have a Windows performance issue. Author Clint Huffman, a Microsoft veteran of over fifteen years, shows you how to identify and alleviate problems with the computer resources of disk, memory, processor, and network. You will learn to use performance counters as the initial indicators, then use various tools to "dig in" to the problem, as well as how to capture and analyze boot performance problems.
Nonlinear physics continues to be an area of dynamic modern research, with applications to physics, engineering, chemistry, mathematics, computer science, biology, medicine and economics. In this text extensive use is made of the Mathematica computer algebra system. No prior knowledge of Mathematica or programming is assumed. This book includes 33 experimental activities that are designed to deepen and broaden the reader's understanding of nonlinear physics. These activities are correlated with Part I, the theoretical framework of the text.
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions are infeasible. Evolutionary algorithms represent a powerful and easily understood means of approximating the optimum value in a variety of settings. The proposed text seeks to guide readers through the crucial issues of optimization problems in statistical settings and the implementation of tailored methods (including both stand-alone evolutionary algorithms and hybrid crosses of these procedures with standard statistical algorithms like Metropolis-Hastings) in a variety of applications. This book would serve as an excellent reference work for statistical researchers at an advanced graduate level or beyond, particularly those with a strong background in computer science.
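A minimal sketch of the core idea, assuming a toy objective function and made-up algorithm settings (the book's own algorithms and statistical applications are far richer):

```python
import numpy as np

def fitness(population):
    # Toy objective: maximize the negative sum of squares (optimum at the origin).
    return -np.sum(population ** 2, axis=1)

rng = np.random.default_rng(1)
population = rng.normal(size=(30, 5))        # 30 candidate solutions in 5 dimensions

for generation in range(200):
    offspring = population + rng.normal(scale=0.1, size=population.shape)  # Gaussian mutation
    combined = np.vstack([population, offspring])
    survivors = np.argsort(fitness(combined))[-30:]                        # survivor selection
    population = combined[survivors]

best = population[np.argmax(fitness(population))]
print(best)   # should lie close to the zero vector
```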
Post-Optimal Analysis in Linear Semi-Infinite Optimization examines the following topics with regard to linear semi-infinite optimization: modeling uncertainty, qualitative stability analysis, quantitative stability analysis and sensitivity analysis. Linear semi-infinite optimization (LSIO) deals with linear optimization problems where the dimension of the decision space or the number of constraints is infinite. The authors compare the post-optimal analysis with alternative approaches to uncertain LSIO problems and provide readers with criteria to choose the best way to model a given uncertain LSIO problem depending on the nature and quality of the data along with the available software. This work also contains open problems which readers will find intriguing and challenging. Post-Optimal Analysis in Linear Semi-Infinite Optimization is aimed at researchers and at graduate and post-graduate students of mathematics interested in optimization, parametric optimization and related topics.
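For orientation, the most common primal form of such a problem keeps finitely many variables but imposes infinitely many constraints; in standard notation (not taken from the book):

```latex
\min_{x \in \mathbb{R}^n} \; c^{\top} x
\quad \text{subject to} \quad
a_t^{\top} x \ge b_t \quad \text{for all } t \in T ,
```

where T is an infinite index set; in the dual counterpart it is instead the decision space that becomes infinite-dimensional.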
Sampling consists of selection, acquisition, and quantification of a part of the population. While selection and acquisition apply to physical sampling units of the population, quantification pertains only to the variable of interest, which is a particular characteristic of the sampling units. A sampling procedure is expected to provide a sample that is representative with respect to some specified criteria. Composite sampling, under idealized conditions, incurs no loss of information for estimating the population means. But an important limitation to the method has been the loss of information on individual sample values, such as an extremely large value. In many of the situations where individual sample values are of interest or concern, composite sampling methods can be suitably modified to retrieve the information on individual sample values that may be lost due to compositing. This book presents statistical solutions to issues that arise in the context of applications of composite sampling.
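A toy numerical illustration of the two points above, with made-up data in Python (not taken from the book): equal-sized composites reproduce the mean exactly, but the largest individual value is no longer visible.

```python
import numpy as np

rng = np.random.default_rng(2)
values = rng.lognormal(mean=0.0, sigma=1.0, size=120)   # individual sample values

# Measure 30 composites of 4 units each; each composite reports only its average.
composites = values.reshape(30, 4).mean(axis=1)

print(values.mean(), composites.mean())   # identical estimates of the population mean
print(values.max(), composites.max())     # the extreme individual value is masked
```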
Microsoft SharePoint 2013 provides a collection of tools and services you can use to improve user and team productivity, make information sharing more effective, and facilitate business decision-making processes. In order to get the most out of SharePoint 2013, you need to understand how to best use the capabilities to support your information management, collaboration, and business process management needs. The SharePoint 2013 User's Guide is designed to provide you with the information you need to effectively use these tools. Whether you are using SharePoint as an intranet or business solution platform, you will learn how to use the resources (such as lists, libraries, and sites) and services (such as publishing, workflow, and policies) that make up these environments. In the fourth edition of this bestselling book, authors Seth Bates and Tony Smith walk you through the components and capabilities that make up a SharePoint 2013 environment. Their expertise shines as they provide step-by-step instructions for using and managing these elements, as well as recommendations for how to best leverage them. As a reader, you'll then embrace two common SharePoint uses, document management and project information management, and walk through creating samples of these solutions, understanding the challenges these solutions are designed to address and the benefits they can provide. The authors have brought together this information based on their extensive experience working with these tools and with business users who effectively leverage these technologies within their organizations. These real-world practices were incorporated into the writing of this book to make it easy for you to gain the knowledge you need to make the most of the product. Pick up a copy of the SharePoint 2013 User's Guide today. What you'll learn: * How to use common SharePoint resources like lists, libraries, and sites * When and how workflows can control the flow and action of content * How to create policies for SharePoint information management and control * The knowledge you need to build and manage intranet and business process solutions * and much more. Who this book is for: Whether you have not yet used SharePoint at all, have used previous versions, have just started using the basic features, or have been using it for a long time, this book provides the skills you need to work efficiently with the capabilities SharePoint 2013 provides.
This pocket guide explains the content and the practical use of ISO 21500 - Guidance on project management, the latest international standard for project management, and the first of a family of ISO standards for project, portfolio and program management. ISO 21500 is meant for senior managers and project sponsors, so that they can better understand project management and properly support projects, and for project managers and their team members, so that they have a reference for comparing their projects to others; it can also be used as a basis for the development of national standards. This pocket guide provides a quick introduction as well as a structured overview of this guidance and deals with the key issues within project management: * Roles and responsibilities * Balancing the project constraints * Competencies of project personnel All ISO 21500 subject groups (themes) are explained: Integration, Stakeholder, Scope, Resource, Time, Cost, Risk, Quality, Procurement and Communication. A separate chapter explains the comparison between ISO 21500 and the PMBOK(R) Guide, PRINCE2, Agile, Lean, Six Sigma and other methods, practices and models. Finally, it provides a high-level description of how ISO 21500 can be applied in practice using a generic project life cycle. Proper application of this new globally accepted project management guideline will support organizations and individuals in growing their project management maturity consistently to a professional level.
This book presents methods for computing correlation equations. All the topics treated here are elucidated in terms of concrete examples, which have been chosen, for the most part, from the field of analysis of the mechanical properties of steel, wood, and other materials. A necessary prerequisite for any study of correlation equations is some knowledge of the moments of random variables. In the Appendix, there is provided a brief treatment of moments, as well as a discussion of the simplest methods of computing them. We have paid particular attention in this book to the techniques of computing correlation equations, and to the use of tables for alleviating the computational load. The mathematical bases of the methods used in setting up correlation equations are expounded in the books cited at the end of this volume. A. M. December 1965. Please note that the abbreviation lg is used in this book to designate the logarithm to base ten. Note further that the comma has been retained as the decimal point in tabular material.
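As a small worked sketch of what such a correlation equation looks like in the simplest linear case (the numbers are invented; the book's own examples use properties of steel, wood, and other materials):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # e.g. a measured material property
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])   # a correlated response

# Moments needed for the linear correlation equation y = a + b x.
mx, my = x.mean(), y.mean()               # first moments
cov_xy = ((x - mx) * (y - my)).mean()     # mixed second central moment
var_x = ((x - mx) ** 2).mean()            # second central moment of x

b = cov_xy / var_x                        # slope of the correlation equation
a = my - b * mx                           # intercept
print(f"y = {a:.3f} + {b:.3f} x")
```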
This book evolved from lectures, courses and workshops on missing data and small-area estimation that I presented during my tenure as the first Campion Fellow (2000-2002). For the Fellowship I proposed these two topics as areas in which academic statistics could contribute to the development of government statistics, in exchange for access to the operational details and background that would inform the direction and sharpen the focus of academic research. After a few years of involvement, I have come to realise that the separation of 'academic' and 'industrial' statistics is not well suited to either party, and their integration is the key to progress in both branches. Most of the work on this monograph was done while I was a visiting lecturer at Massey University, Palmerston North, New Zealand. The hospitality and stimulating academic environment of their Institute of Information Sciences and Technology is gratefully acknowledged. I could not name all those who commented on my lecture notes and on the presentations themselves; apart from them, I want to thank the organisers and silent attendees of all the events, and, with a modicum of reluctance, the 'grey figures' who kept inquiring whether I was any nearer the completion of whatever stage I had been foolish enough to attach a date to.
Looking back at the years that have passed since the realization of the very first electronic, multi-purpose computers, one observes a tremendous growth in hardware and software performance. Today, researchers and engineers have access to computing power and software that can solve numerical problems which are not fully understood in terms of existing mathematical theory. Thus, computational sciences must in many respects be viewed as experimental disciplines. As a consequence, there is a demand for high-quality, flexible software that allows, and even encourages, experimentation with alternative numerical strategies and mathematical models. Extensibility is then a key issue; the software must provide an efficient environment for incorporation of new methods and models that will be required in future problem scenarios. The development of such kind of flexible software is a challenging and expensive task. One way to achieve these goals is to invest much work in the design and implementation of generic software tools which can be used in a wide range of application fields. In order to provide a forum where researchers could present and discuss their contributions to the described development, an International Workshop on Modern Software Tools for Scientific Computing was arranged in Oslo, Norway, September 16-18, 1996. This workshop, informally referred to as SciTools '96, was a collaboration between SINTEF Applied Mathematics and the Departments of Informatics and Mathematics at the University of Oslo.
Separation of signal from noise is the most fundamental problem in data analysis, arising in such fields as signal processing, econometrics, actuarial science, and geostatistics. This book introduces the local regression method in univariate and multivariate settings, with extensions to local likelihood and density estimation. Practical information is also included on how to implement these methods in the programs S-PLUS and LOCFIT.
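The book's examples are written for S-PLUS and LOCFIT; as a loose Python analogue (simulated data, LOWESS in place of LOCFIT), local regression smoothing looks like this:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 200)
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)   # signal buried in noise

# Fit local regressions over a sliding 20% neighbourhood of the data.
smoothed = lowess(y, x, frac=0.2)
print(smoothed[:5])   # columns: x (sorted) and the locally fitted values
```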
This book deals with the performance analysis of closed queueing networks with general processing times and finite buffer spaces. It offers a detailed introduction to the problem and a comprehensive literature review. Two approaches to the performance analysis of closed queueing networks are presented. One is an approximate decomposition approach, while the second is the first exact approach for finite-capacity networks with general processing times. In this Markov chain approach, queueing networks are analyzed by modeling the entire system as one Markov chain. As this approach is exact, it is well suited both as a reference quantity for approximate procedures and as an extension to other queueing networks. Moreover, for the first time, the exact distribution of the time between processing starts is provided.
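To make the Markov chain idea concrete, here is a deliberately tiny sketch: a closed network of two single-server stations with two circulating jobs and, for simplicity, exponential service, encoded as one continuous-time Markov chain. The rates are invented, and the exponential simplification departs from the book's general-processing-time setting.

```python
import numpy as np

mu1, mu2 = 1.0, 1.5   # hypothetical service rates of stations 1 and 2

# State = number of jobs at station 1 (0, 1, or 2); the remaining jobs are at station 2.
Q = np.array([
    [-mu2,          mu2,   0.0],   # station 2 finishes, a job moves to station 1
    [ mu1, -(mu1 + mu2),   mu2],   # either station may finish
    [ 0.0,          mu1,  -mu1],   # station 1 finishes, a job moves to station 2
])

# Stationary distribution: solve pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]
print(pi)   # long-run probabilities of the three network states
```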
The advent of fast and sophisticated computer graphics has brought dynamic and interactive images under the control of professional mathematicians and mathematics teachers. This volume in the NATO Special Programme on Advanced Educational Technology takes a comprehensive and critical look at how the computer can support the use of visual images in mathematical problem solving. The contributions are written by researchers and teachers from a variety of disciplines including computer science, mathematics, mathematics education, psychology, and design. Some focus on the use of external visual images and others on the development of individual mental imagery. The book is the first collected volume in a research area that is developing rapidly, and the authors pose some challenging new questions.
Developments in both computer hardware and software over the decades have fundamentally changed the way people solve problems. Technical professionals have greatly benefited from new tools and techniques that have allowed them to be more efficient, accurate, and creative in their work. Maple V and the new generation of mathematical computation systems have the potential of having the same kind of revolutionary impact as high-level general purpose programming languages (e.g. FORTRAN, BASIC, C), application software (e.g. spreadsheets, Computer Aided Design - CAD), and even calculators have had. Maple V has amplified our mathematical abilities: we can solve more problems more accurately, and more often. In specific disciplines, this amplification has taken excitingly different forms. Perhaps the greatest impact has been felt by the education community. Today, it is nearly impossible to find a college or university that has not introduced mathematical computation, in some form, into the curriculum. Students now have regular access to the amount of computational power that was available to a very exclusive set of researchers five years ago. This has produced tremendous pedagogical challenges and opportunities. Comparisons to the calculator revolution of the 70's are inescapable. Calculators have extended the average person's ability to solve common problems more efficiently, and arguably, in better ways. Today, one needs at least a calculator to deal with standard problems in life - budgets, mortgages, gas mileage, etc. For business people or professionals, the ...
Practical SharePoint 2013 Governance is the first book to offer practical and action-focused SharePoint governance guidance based on consulting experiences with real organizations in the field. It provides the quintessential governance reference guide for SharePoint consultants, administrators, architects, and anyone else looking for actual hands-on governance guidance. This book goes beyond filling in a governance document template and focuses entirely on actions to take and behaviors to adopt for addressing real-world governance challenges. It walks you through how to define what SharePoint offers and who is involved, offers key governance strategies for you to adopt or to recommend to your customers, and provides real-world examples that apply each governance concept to an actual scenario.
The intensive use of automatic data acquisition systems and the use of cloud computing for process monitoring have led to an increased occurrence of industrial processes that utilize statistical process control and capability analysis. These analyses are performed almost exclusively with multivariate methodologies. The aim of this Brief is to present the most important multivariate statistical quality control (MSQC) techniques developed in the R language. The book is divided into two parts. The first part contains the basic R elements, an introduction to statistical procedures, and the main aspects related to Statistical Quality Control (SQC). The second part covers the construction of multivariate control charts and the calculation of multivariate capability indices.
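The book's code is in R; purely as an illustration of the statistic behind a multivariate control chart, the following Python sketch computes Hotelling's T-squared values for simulated observations:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.multivariate_normal(mean=[0.0, 0.0, 0.0], cov=np.eye(3), size=50)  # 50 samples, 3 quality variables

xbar = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))   # inverse of the sample covariance matrix

# T^2_i = (x_i - xbar)' S^{-1} (x_i - xbar) for each observation i.
diffs = X - xbar
t2 = np.einsum("ij,jk,ik->i", diffs, S_inv, diffs)
print(t2[:5])   # points that would be plotted against the chart's control limit
```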
Intended for both researchers and practitioners, this book will be a valuable resource for studying and applying recent robust statistical methods. It contains up-to-date research results in the theory of robust statistics, treats computational aspects and algorithms, and shows interesting and new applications.
This book presents the statistical analysis of compositional data sets, i.e., data in percentages, proportions, concentrations, etc. The subject is covered from its grounding principles to the practical use in descriptive exploratory analysis, robust linear models and advanced multivariate statistical methods, including zeros and missing values, and paying special attention to data visualization and model display issues. Many illustrated examples and code chunks guide the reader into their modeling and interpretation. And, though the book primarily serves as a reference guide for the R package "compositions," it is also a general introductory text on Compositional Data Analysis. Awareness of their special characteristics spread in the Geosciences in the early sixties, but a strategy for properly dealing with them was not available until the works of Aitchison in the eighties. Since then, research has expanded our understanding of their theoretical principles and the potentials and limitations of their interpretation. This is the first comprehensive textbook addressing these issues, as well as their practical implications with regard to software. The book is intended for scientists interested in statistically analyzing their compositional data. The subject enjoys relatively broad awareness in the geosciences and environmental sciences, but the spectrum of recent applications also covers areas like medicine, official statistics, and economics. Readers should be familiar with basic univariate and multivariate statistics. Knowledge of R is recommended but not required, as the book is self-contained.
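The book itself is organized around the R package "compositions"; as a tiny illustrative sketch (here in Python, with made-up proportions), the centred log-ratio transform that underlies much of Aitchison's approach is simply:

```python
import numpy as np

# Two made-up 3-part compositions (each row sums to 1).
comp = np.array([
    [0.2, 0.3, 0.5],
    [0.1, 0.6, 0.3],
])

# Centred log-ratio (clr): log of each part divided by the row's geometric mean.
gmean = np.exp(np.log(comp).mean(axis=1, keepdims=True))
clr = np.log(comp / gmean)

print(clr)
print(clr.sum(axis=1))   # each row sums to zero, up to floating-point error
```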
This Handbook gives a comprehensive snapshot of a field at the intersection of mathematics and computer science, with applications in physics, engineering and education. It reviews 67 software systems and offers 100 pages on applications in physics, mathematics, computer science, engineering, chemistry and education.
Automatic Graph Drawing is concerned with the layout of relational structures as they occur in Computer Science (Database Design, Data Mining, Web Mining), Bioinformatics (Metabolic Networks), Business Informatics (Organization Diagrams, Event-Driven Process Chains), or the Social Sciences (Social Networks). In mathematical terms, such relational structures are modeled as graphs or more general objects such as hypergraphs, clustered graphs, or compound graphs. A variety of layout algorithms that are based on graph-theoretical foundations have been developed in the last two decades and implemented in software systems. After an introduction to the subject area and a concise treatment of the technical foundations for the subsequent chapters, this book features 14 chapters on state-of-the-art graph drawing software systems, ranging from general "tool boxes" to customized software for various applications. These chapters are written by leading experts; they follow a uniform scheme and can be read independently of each other.
Many of the commonly used methods for modeling and fitting psychophysical data are special cases of statistical procedures of great power and generality, notably the Generalized Linear Model (GLM). This book illustrates how to fit data from a variety of psychophysical paradigms using modern statistical methods and the statistical language R. The paradigms include signal detection theory, psychometric function fitting, classification images and more. In two chapters, recently developed methods for scaling appearance, maximum likelihood difference scaling and maximum likelihood conjoint measurement, are examined. The authors also consider the application of mixed-effects models to psychophysical data. R is an open-source programming language that is widely used by statisticians and is seeing enormous growth in its application to data in all fields. It is interactive, containing many powerful facilities for optimization, model evaluation, model selection, and graphical display of data. The reader who fits data in R can readily make use of these methods. The researcher who uses R to fit and model his data has access to the most recently developed statistical methods. This book does not assume that the reader is familiar with R, and a little experience with any programming language is all that is needed to appreciate this book. There are large numbers of examples of R in the text, and the source code for all examples is available in the R package MPDiR, available through R. Laurence T. Maloney is Professor of Psychology and Neural Science at New York University. His research focusses on applications of mathematical models to perception, motor control and decision making.
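As a bare-bones illustration of psychometric function fitting via a GLM (the book's own examples use R and the MPDiR package; this sketch uses Python's statsmodels and simulated yes/no data, with an assumed underlying function):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
intensity = np.repeat(np.linspace(-2.0, 2.0, 9), 40)        # 9 stimulus levels, 40 trials each
true_p = 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * intensity)))     # assumed underlying psychometric function
response = rng.binomial(1, true_p)                          # simulated yes/no responses

# Binomial GLM with the default logistic link: a psychometric function fit.
X = sm.add_constant(intensity)
fit = sm.GLM(response, X, family=sm.families.Binomial()).fit()
print(fit.params)   # estimated intercept and slope of the psychometric function
```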
Every advance in computer architecture and software tempts statisticians to tackle numerically harder problems. To do so intelligently requires a good working knowledge of numerical analysis. This book equips students to craft their own software and to understand the advantages and disadvantages of different numerical methods. Issues of numerical stability, accurate approximation, computational complexity, and mathematical modeling share the limelight in a broad yet rigorous overview of those parts of numerical analysis most relevant to statisticians. In this second edition, the material on optimization has been completely rewritten. There is now an entire chapter on the MM algorithm in addition to more comprehensive treatments of constrained optimization, penalty and barrier methods, and model selection via the lasso. There is also new material on the Cholesky decomposition, Gram-Schmidt orthogonalization, the QR decomposition, the singular value decomposition, and reproducing kernel Hilbert spaces. The discussions of the bootstrap, permutation testing, independent Monte Carlo, and hidden Markov chains are updated, and a new chapter on advanced MCMC topics introduces students to Markov random fields, reversible jump MCMC, and convergence analysis in Gibbs sampling. Numerical Analysis for Statisticians can serve as a graduate text for a course surveying computational statistics. With a careful selection of topics and appropriate supplementation, it can be used at the undergraduate level. It contains enough material for a graduate course on optimization theory. Because many chapters are nearly self-contained, professional statisticians will also find the book useful as a reference.
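In the spirit of the linear-algebra topics listed above, a short sketch (with invented data) of solving a least-squares problem through the QR decomposition:

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 3))                  # design matrix
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=100)

# Thin QR factorization: X = QR with R upper triangular.
Q, R = np.linalg.qr(X)
beta_hat = np.linalg.solve(R, Q.T @ y)         # solve R beta = Q'y
print(beta_hat)                                # close to beta_true
```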