Microsoft Windows 8.1 and Windows Server 2012 R2 are designed to be the best performing operating systems to date, but even the best systems can be overwhelmed with load and/or plagued with poorly performing code. Windows Performance Analysis Field Guide gives you a practical field guide approach to performance monitoring and analysis from experts who do this work every day. Think of this book as your own guide to "What would Microsoft support do?" when you have a Windows performance issue. Author Clint Huffman, a Microsoft veteran of over fifteen years, shows you how to identify and alleviate problems with the computer resources of disk, memory, processor, and network. You will learn to use performance counters as the initial indicators, then use various tools to "dig in" to the problem, as well as how to capture and analyze boot performance problems.
John Chambers turns his attention to R, the enormously successful open-source system based on the S language. His book guides the reader through programming with R, beginning with simple interactive use and progressing by gradual stages, starting with simple functions. More advanced programming techniques can be added as needed, allowing users to grow into software contributors, benefiting their careers and the community. R packages provide a powerful mechanism for contributions to be organized and communicated. This is the only advanced programming book on R, written by the author of the S language from which R evolved.
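To give a flavour of the progression the book describes (from interactive use to writing simple functions), here is a minimal R sketch, not taken from the book; the data and the `cv` helper are made up for illustration:

    # interactive use: a numeric vector and built-in summaries
    x <- c(2.1, 3.5, 4.8, 5.2)
    mean(x)
    sd(x)

    # a first user-defined function: coefficient of variation
    cv <- function(x, na.rm = FALSE) {
      sd(x, na.rm = na.rm) / mean(x, na.rm = na.rm)
    }
    cv(x)

Packaging such functions, together with documentation and tests, is the natural next step toward the kind of software contribution the book has in view.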
Choose the Proper Statistical Method for Your Sensory Data Issue. Analyzing Sensory Data with R gives you the foundation to analyze and interpret sensory data. The book helps you find the most appropriate statistical method to tackle your sensory data issue. Covering quantitative, qualitative, and affective approaches, the book presents the big picture of sensory evaluation. Through an integrated approach that connects the different dimensions of sensory evaluation, you'll understand:
* The reasons why sensory data are collected
* The ways in which the data are collected and analyzed
* The intrinsic meaning of the data
* The interpretation of the data analysis results
Each chapter corresponds to one main sensory topic. The chapters start by presenting the nature of the sensory evaluation and its objectives, the sensory particularities related to the sensory evaluation, details about the data set obtained, and the statistical analyses required. Using real examples, the authors then illustrate step by step how the analyses are performed in R. The chapters conclude with variants and extensions of the methods that are related to the sensory task itself, the statistical methodology, or both.
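As a flavour of the kind of analysis performed in R throughout the book, here is a minimal sketch (not taken from the book) of a two-way ANOVA on a hypothetical sensory profiling data set; the data frame `sensory` and its columns `product`, `panelist` and `sweetness` are invented for illustration:

    set.seed(123)
    sensory <- data.frame(
      product   = factor(rep(paste0("P", 1:4), each = 10)),
      panelist  = factor(rep(1:10, times = 4)),
      sweetness = round(runif(40, 1, 9))
    )
    # do products differ in perceived sweetness, accounting for panelist effects?
    fit <- aov(sweetness ~ product + panelist, data = sensory)
    summary(fit)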
About this book:
* Gives the reader hands-on, example-based experience of simulating dynamical models in MATLAB®/Simulink® and animating them in VRML.
* More than 150 images describe each step in the model realizations, helping readers to understand them visually.
* Diverse examples and thorough problem treatment enable the reader to animate complex dynamical problems.
* m-files, Simulink models, VRML files and jpegs available for download provide full solutions for the end-of-chapter problems.
Virtual Reality and Animation for MATLAB® and Simulink® Users demonstrates the simulation and animation of physical systems using the MATLAB® Virtual Reality Toolbox (virtual models are created in V-Realm Builder). The book is divided into two parts: the first addresses MATLAB® and the second Simulink®. The presentation is problem-based, with each chapter teaching the reader a group of essential principles in the context of a step-by-step solution to a particular issue. Examples of the systems covered include mass-spring-dampers, a crank-slider mechanism and a moving vehicle. The examples are given in ascending order of difficulty and contain MATLAB®/Simulink® code deliberately simplified so that readers can focus on: understanding how to link a 3-D virtual scene to MATLAB®/Simulink®, and manipulating the 3-D virtual scene in MATLAB®/Simulink®. When studied in sequence, the chapters of this text form a coherent whole, enabling the reader to gain thorough expertise in virtual simulation and animation of dynamical models using MATLAB®/Simulink®. Individual chapters stand on their own, however, so that readers interested in a particular system can concentrate on it easily. Problems are provided in each chapter to give practice in the techniques demonstrated and to extend the range of the systems studied, for example into the control sphere. Solution code for these problems can be downloaded from insert URL. Whether modeling the dynamics of a simple pendulum, a robot arm or a moving car, animation of a dynamical model can enliven and encourage understanding of mechanical systems and thus contribute to control design. Virtual Reality and Animation for MATLAB® and Simulink® Users will be instructive and interesting to anyone, researcher or student, working with the dynamics of physical systems. Readers are assumed to have some familiarity with MATLAB®.
SAS® Programming for Researchers and Social Scientists, Second Edition, by Paul E. Spector. "Just what the novice SAS programmer needs, particularly those who have no real programming experience. For example, branching is one of the more difficult programming commands for students to implement and the author does an excellent job of explaining this topic clearly and at a basic level. A big plus is the Common Errors section since students will definitely encounter errors." – Robert Pavur, Management Science, University of North Texas. The book that won accolades from thousands has been completely revised! Taking a problem-solving approach that focuses on common programming tasks that social scientists encounter in doing data analysis, Spector uses sample programs and examples from social science problems to show readers how to write orderly programs and avoid excessive and disorganized branching. He provides readers with a three-step approach (preplanning, writing the program, and debugging) and tips about helpful features and practices, as well as how to avoid certain pitfalls. "Spector has done an excellent job in explaining a somewhat difficult topic in a clear and concise manner. I like the fact that screen captures are included. It allows students to better follow what is being described in the book in relation to what is on the screen." – Philip Craiger, Computer Science, University of Nebraska, Omaha. Updated to the latest SAS releases, the book has been thoroughly revised to provide readers with even more practical tips and advice. New features in this edition include:
* New sections on debugging in each chapter that provide advice about common errors
* End-of-chapter Debugging Exercises that offer readers the chance to practice spotting the errors in the sample programs
* A new section in Chapter 1 on how to use the interface, including how to work with three separate windows, where to write the program, executing the program, managing the program files, and using the F key
* Five new appendices, including a Glossary of Programming Terms, a Summary of SAS Language Statements, a Summary of SAS PROCs, Information Sources for SAS PROCs, and Corrections for the Debugging Exercises
* Plus, a link to Spector's online SAS course!
Appropriate for readers with little or no knowledge of the SAS language, this book will enable readers to run each example, adapt the examples to real problems that the reader may have, and create a program. "A solid introduction to programming in SAS, with a good, brief explanation of how that process differs from the usual point-and-click of Windows-based software such as SPSS and a spreadsheet. Even uninformed students can use it as a guide to creating SAS datasets, manipulating them, and writing programs in the SAS language that will produce all manner of statistical results." – James P. Whittenburg, History, College of William & Mary
"Bridges the gap between programming syntax and programming applications. In contrast to other books on SAS programming, this book combines a clear explanation of the SAS language with a problem-solving approach to writing a SAS program. It provides the novice programmer with a useful and meaningful model for solving the types of programming problems encountered by researchers and social scientists." – John E. Cornell, Biostatistician, Audie L. Murphy Memorial Hospital
For courses in Graduate MIS, Decision Support Systems, and courses covering the principles of enterprise resource planning systems. This text takes a generic approach to enterprise resource planning systems and their interrelationships, covering all functional areas of this new type of management challenge. It discusses the re-design of business processes, changes in organizational structure, and effective management strategies that will help assure competitiveness, responsiveness, productivity, and global impact for many organizations in the years ahead.
Learn How to Use Growth Curve Analysis with Your Time Course Data An increasingly prominent statistical tool in the behavioral sciences, multilevel regression offers a statistical framework for analyzing longitudinal or time course data. It also provides a way to quantify and analyze individual differences, such as developmental and neuropsychological, in the context of a model of the overall group effects. To harness the practical aspects of this useful tool, behavioral science researchers need a concise, accessible resource that explains how to implement these analysis methods. Growth Curve Analysis and Visualization Using R provides a practical, easy-to-understand guide to carrying out multilevel regression/growth curve analysis (GCA) of time course or longitudinal data in the behavioral sciences, particularly cognitive science, cognitive neuroscience, and psychology. With a minimum of statistical theory and technical jargon, the author focuses on the concrete issue of applying GCA to behavioral science data and individual differences. The book begins with discussing problems encountered when analyzing time course data, how to visualize time course data using the ggplot2 package, and how to format data for GCA and plotting. It then presents a conceptual overview of GCA and the core analysis syntax using the lme4 package and demonstrates how to plot model fits. The book describes how to deal with change over time that is not linear, how to structure random effects, how GCA and regression use categorical predictors, and how to conduct multiple simultaneous comparisons among different levels of a factor. It also compares the advantages and disadvantages of approaches to implementing logistic and quasi-logistic GCA and discusses how to use GCA to analyze individual differences as both fixed and random effects. The final chapter presents the code for all of the key examples along with samples demonstrating how to report GCA results. Throughout the book, R code illustrates how to implement the analyses and generate the graphs. Each chapter ends with exercises to test your understanding. The example datasets, code for solutions to the exercises, and supplemental code and examples are available on the author's website.
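As an illustration of the core analysis syntax the book builds on, here is a minimal growth curve sketch using the lme4 and ggplot2 packages it names; the simulated data frame `d` and its columns `Subject`, `Time` and `Score` are invented for this example and are not from the book:

    library(lme4)
    library(ggplot2)
    set.seed(1)

    d <- data.frame(
      Subject = factor(rep(1:20, each = 6)),
      Time    = rep(0:5, times = 20)
    )
    # simulated scores: a group-level slope plus subject-level variation
    d$Score <- 10 + 2 * d$Time + rep(rnorm(20, sd = 3), each = 6) + rnorm(120, sd = 2)

    # fixed effect of Time, random intercepts and slopes by Subject
    m <- lmer(Score ~ Time + (Time | Subject), data = d)
    summary(m)

    # overlay model fits on the observed trajectories
    d$fit <- fitted(m)
    ggplot(d, aes(Time, Score, group = Subject)) +
      geom_line(alpha = 0.3) +
      geom_line(aes(y = fit), colour = "blue", alpha = 0.5)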
Multilevel and Longitudinal Modeling with IBM SPSS, Third Edition, demonstrates how to use the multilevel and longitudinal modeling techniques available in IBM SPSS Versions 25-27. Annotated screenshots with all relevant output provide readers with a step-by-step understanding of each technique as they are shown how to navigate the program. Throughout, diagnostic tools, data management issues, and related graphics are introduced. SPSS commands show the flow of the menu structure and how to facilitate model building, while annotated syntax is also available for those who prefer this approach. Extended examples illustrating the logic of model development and evaluation are included throughout the book, demonstrating the context and rationale of the research questions and the steps around which the analyses are structured.
The book opens with the conceptual and methodological issues associated with multilevel and longitudinal modeling, followed by a discussion of SPSS data management techniques that facilitate working with multilevel, longitudinal, or cross-classified data sets. The next few chapters introduce the basics of multilevel modeling, developing a multilevel model, extensions of the basic two-level model (e.g., three-level models, models for binary and ordinal outcomes), and troubleshooting techniques for everyday-use programming and modeling problems along with potential solutions. Models for investigating individual and organizational change are next developed, followed by models with multivariate outcomes and, finally, models with cross-classified and multiple membership data structures. The book concludes with thoughts about ways to expand on the various multilevel and longitudinal modeling techniques introduced and issues (e.g., missing data, sample weights) to keep in mind in conducting multilevel analyses.
Key features of the third edition:
* Thoroughly updated throughout to reflect IBM SPSS Versions 26-27.
* Introduction to fixed-effects regression for examining change over time where random-effects modeling may not be an optimal choice.
* Additional treatment of key topics specifically aligned with multilevel modeling (e.g., models with binary and ordinal outcomes).
* Expanded coverage of models with cross-classified and multiple membership data structures.
* Added discussion on model checking for improvement (e.g., examining residuals, locating outliers).
* Further discussion of alternatives for dealing with missing data and the use of sample weights within multilevel data structures.
Supported by online data sets, the book's practical approach makes it an essential text for graduate-level courses on multilevel, longitudinal, latent variable modeling, multivariate statistics, or advanced quantitative techniques taught in departments of business, education, health, psychology, and sociology. The book will also prove appealing to researchers in these fields. The book is designed to provide an excellent supplement to Heck and Thomas's An Introduction to Multilevel Modeling Techniques, Fourth Edition; however, it can also be used with any multilevel or longitudinal modeling book or as a stand-alone text.
This book provides a full representation of Inverse Synthetic Aperture Radar (ISAR) imagery, which is a popular and important radar signal processing tool. The book covers all possible aspects of ISAR imaging. The book offers a fair amount of signal processing techniques and radar basics before introducing the inverse problem of ISAR and the forward problem of Synthetic Aperture Radar (SAR). Important concepts of SAR such as resolution, pulse compression and image formation are given together with associated MATLAB codes. After providing the fundamentals for ISAR imaging, the book gives the detailed imaging procedures for ISAR imaging with associated MATLAB functions and codes. To enhance the image quality in ISAR imaging, several imaging tricks and fine-tuning procedures such as zero-padding and windowing are also presented. Finally, various real applications of ISAR imagery, like imaging the antenna-platform scattering, are given in a separate chapter. For all these algorithms, MATLAB codes and figures are included. The final chapter considers advanced concepts and trends in ISAR imaging.
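For orientation, a standard relation underlying the resolution and pulse compression material (stated here from radar fundamentals, not quoted from the book): the achievable range resolution is set by the transmitted bandwidth B,

    \Delta r = \frac{c}{2B}

where c is the speed of light, so wider-bandwidth waveforms yield finer range resolution in both SAR and ISAR imaging.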
Designed for a graduate course in applied statistics, Nonparametric Methods in Statistics with SAS Applications teaches students how to apply nonparametric techniques to statistical data. It starts with tests of hypotheses and moves on to regression modeling, time-to-event analysis, density estimation, and resampling methods. The text begins with classical nonparametric hypothesis testing, including the sign, Wilcoxon sign-rank and rank-sum, Ansari-Bradley, Kolmogorov-Smirnov, Friedman rank, Kruskal-Wallis H, Spearman rank correlation coefficient, and Fisher exact tests. It then discusses smoothing techniques (loess and thin-plate splines) for classical nonparametric regression as well as binary logistic and Poisson models. The author also describes time-to-event nonparametric estimation methods, such as the Kaplan-Meier survival curve and Cox proportional hazards model, and presents histogram and kernel density estimation methods. The book concludes with the basics of jackknife and bootstrap interval estimation. Drawing on data sets from the author's many consulting projects, this classroom-tested book includes various examples from psychology, education, clinical trials, and other areas. It also presents a set of exercises at the end of each chapter. All examples and exercises require the use of SAS 9.3 software. Complete SAS code for all examples is given in the text. Large data sets for the exercises are available on the author's website.
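The book itself works in SAS; as a brief language-agnostic illustration of two of the classical tests it covers, here is a sketch in base R with made-up data:

    x <- c(12, 15, 14, 10, 18, 17)
    y <- c(20, 22, 19, 24, 18, 21)

    wilcox.test(x, y)                          # Wilcoxon rank-sum test
    kruskal.test(list(x, y, c(30, 28, 27)))    # Kruskal-Wallis H test, three groups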
Using the same accessible, hands-on approach as its best-selling predecessor, the Handbook of Univariate and Multivariate Data Analysis with IBM SPSS, Second Edition explains how to apply statistical tests to experimental findings, identify the assumptions underlying the tests, and interpret the findings. This second edition now covers more topics and has been updated with the SPSS statistical package for Windows. New to the Second Edition:
* Three new chapters on multiple discriminant analysis, logistic regression, and canonical correlation
* New section on how to deal with missing data
* Coverage of tests of assumptions, such as linearity, outliers, normality, homogeneity of variance-covariance matrices, and multicollinearity
* Discussions of the calculation of Type I error and the procedure for testing statistical significance between two correlation coefficients obtained from two samples
* Expanded coverage of factor analysis, path analysis (test of the mediation hypothesis), and structural equation modeling
Suitable for both newcomers and seasoned researchers in the social sciences, the handbook offers a clear guide to selecting the right statistical test, executing a wide range of univariate and multivariate statistical tests via the Windows and syntax methods, and interpreting the output results. The SPSS syntax files used for executing the statistical tests can be found in the appendix. Data sets employed in the examples are available on the book's CRC Press web page.
The SPSS Survival Manual throws a lifeline to students and researchers grappling with this powerful data analysis software. In her bestselling guide, Julie Pallant takes you through the entire research process, helping you choose the right data analysis technique for your project. This edition has been updated to include up to SPSS version 26. From the formulation of research questions, to the design of the study and analysis of data, to reporting the results, Julie discusses basic and advanced statistical techniques. She outlines each technique clearly, with step-by-step procedures for performing the analysis, a detailed guide to interpreting data output and an example of how to present the results in a report. For both beginners and experienced users in Psychology, Sociology, Health Sciences, Medicine, Education, Business and related disciplines, the SPSS Survival Manual is an essential text. It is illustrated throughout with screen grabs, examples of output and tips, and is also further supported by a website with sample data and guidelines on report writing. This seventh edition is fully revised and updated to accommodate changes to IBM SPSS procedures.
Basic Statistics provides an accessible and comprehensive introduction to statistics using the free, state-of-the-art, powerful software program R. This book is designed both to introduce students to key concepts in statistics and to provide simple instructions for using R. This concise book:
* Teaches essential concepts in statistics, assuming little background knowledge on the part of the reader
* Introduces students to R with as few sub-commands as possible for ease of use
* Provides practical examples from the educational, behavioral, and social sciences
With clear explanations of statistical processes and step-by-step commands in R, Basic Statistics will appeal to students and professionals across the social and behavioral sciences.
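As a taste of the "few commands" style the book aims for, here is a minimal sketch (not from the book; the scores are made up) of descriptive statistics and a one-sample t-test in R:

    scores <- c(72, 85, 90, 66, 78, 88, 95, 70)
    mean(scores)
    sd(scores)
    t.test(scores, mu = 75)   # one-sample t-test against a hypothesized mean of 75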
Discover how SAP S/4HANA transforms your supply chain! Explore functionalities for sourcing and procurement, production execution, plant maintenance, sales order management, transportation management, warehouse management, and more. See how intelligent technologies elevate your logistics operations with SAP Business Technology Platform and learn about complementary cloud solutions like SAP Ariba and SAP IBP. This is your starting point for logistics with SAP S/4HANA! In this book, you'll learn about:
a. Key Functionality: See what SAP S/4HANA 2021 has to offer! Walk through your logistics business processes, from production planning to inventory valuation and beyond. Learn about new features such as predictive MRP, centralized procurement, and production engineering and operations.
b. Logistics Innovations: Your supply chain is getting smarter. Discover intelligent technologies enabled by SAP BTP: blockchain, intelligent robotic process automation, machine learning, and more.
c. Planning Your Migration: Prepare for your logistics transformation. Plan your roadmap to SAP S/4HANA, evaluate your implementation approaches, and get insight into the new RISE with SAP offering.
Highlights include: 1) Planning and scheduling 2) Sourcing and procurement 3) Manufacturing operations 4) Quality management 5) Plant maintenance 6) Sales order management 7) Transportation management 8) Inventory management 9) Warehouse management 10) Intelligent technologies 11) Reporting and analytics 12) Industry use cases
Methods of Statistical Model Estimation examines the most important and popular methods used to estimate parameters for statistical models and provide informative model summary statistics. Designed for R users, the book is also ideal for anyone wanting to better understand the algorithms used for statistical model fitting. The text presents algorithms for the estimation of a variety of regression procedures using maximum likelihood estimation, iteratively reweighted least squares regression, the EM algorithm, and MCMC sampling. Fully developed, working R code is constructed for each method. The book starts with OLS regression and generalized linear models, building to two-parameter maximum likelihood models for both pooled and panel models. It then covers a random effects model estimated using the EM algorithm and concludes with a Bayesian Poisson model using Metropolis-Hastings sampling. The book's coverage is innovative in several ways. First, the authors use executable computer code to present and connect the theoretical content. Therefore, code is written for clarity of exposition rather than stability or speed of execution. Second, the book focuses on the performance of statistical estimation and downplays algebraic niceties. In both senses, this book is written for people who wish to fit statistical models and understand them.
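To illustrate the general idea (this is not the book's own code), a maximum likelihood fit can be written directly in R with optim() and checked against glm(); the simulated data below are invented for this sketch:

    set.seed(1)
    x <- rnorm(200)
    y <- rpois(200, lambda = exp(0.5 + 0.8 * x))

    # negative log-likelihood of a Poisson regression with intercept and slope
    negll <- function(beta) {
      eta <- beta[1] + beta[2] * x
      -sum(dpois(y, lambda = exp(eta), log = TRUE))
    }
    opt <- optim(c(0, 0), negll, hessian = TRUE)
    opt$par                            # ML estimates of intercept and slope
    sqrt(diag(solve(opt$hessian)))     # approximate standard errors
    coef(glm(y ~ x, family = poisson)) # should agree closely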
PAMIR (Parameterized Adaptive Multidimensional Integration Routines) is a suite of Fortran programs for multidimensional numerical integration over hypercubes, simplexes, and hyper-rectangles in general dimension p, intended for use by physicists, applied mathematicians, computer scientists, and engineers. The programs, which are available on the internet at www.pamir-integrate.com and are free for non-profit research use, are capable of following localized peaks and valleys of the integrand. Each program comes with a Message-Passing Interface (MPI) parallel version for cluster use as well as serial versions. The first chapter presents introductory material, similar to that on the PAMIR website, and the next is a "manual" giving much more detail on the use of the programs than is on the website. They are followed by many examples of performance benchmarks and comparisons with other programs, and a discussion of the computational integration aspects of PAMIR, in comparison with other methods in the literature. The final chapter provides details of the construction of the algorithms, while the Appendices give technical details and certain mathematical derivations.
Enterprise Resource Planning (ERP), Supply Chain Management (SCM), Customer Relationship Management (CRM), Business Intelligence (BI) and Big Data analytics (BDA) are business-related tasks and processes, which are supported by standardized software solutions. The book explains that this requires business-oriented thinking and acting from IT specialists and data scientists. It is a good idea to let students experience this directly from the business perspective, for example as executives of a virtual company in a role-playing game. The second edition of the book has been completely revised, restructured and supplemented with current topics such as blockchains in supply chains and the relationship between Big Data analytics, artificial intelligence and machine learning. The structure of the book is based on the gradual implementation and integration of the respective information systems from the business and management perspectives. Part I contains chapters with detailed descriptions of the topics, supplemented by online tests and exercises. Part II introduces the role play and the online gaming and simulation environment. Supplementary teaching material, presentations, templates, and video clips are available online in the gaming area. The gaming and business simulation Kdibisglobal.com, newly created for this book, now includes a beer division, a bottled water division, a soft drink division and a manufacturing division for barcode cash register scanners, each with their specific business processes and supply chains.
Ever-changing business needs have prompted large companies to rethink their enterprise IT. Today, businesses must allow interaction with their customers, partners, and employees at more touch points and at a depth never previously thought possible. At the same time, rapid advances in information technologies, like business digitization, cloud computing, and Web 2.0, demand fundamental changes in the enterprise's management practices. These changes have a drastic effect not only on IT and business, but also on policies, processes, and people. Many companies therefore embark on enterprise-wide transformation initiatives. The role of Enterprise Architecture (EA) is to architect and supervise this transformational journey. Unfortunately, today's EA is often a ponderous and detached exercise, with most EA initiatives failing to create visible impact. Enterprises need an EA that is agile and responsive to business dynamics. "Collaborative Enterprise Architecture" provides the innovative solutions today's enterprises require, informed by real-world experiences and experts' insights. In its first part, this book provides a systematic compendium of current best practices in EA, analyzes current ways of doing EA, and identifies its constraints and shortcomings. In the second part, it leaves the beaten tracks of EA by introducing Lean, Agile, and Enterprise 2.0 concepts into traditional EA methods. This blended approach to EA focuses on practical aspects, with recommendations derived from real-world experiences. A truly thought-provoking and pragmatic guide to managing EA, "Collaborative Enterprise Architecture" effectively merges the long-term-oriented top-down approach with pragmatic bottom-up thinking, and in that way offers real solutions to businesses undergoing enterprise-wide change.
These lecture notes provide a rapid, accessible introduction to Bayesian statistical methods. The course covers the fundamental philosophy and principles of Bayesian inference, including the reasoning behind the prior/likelihood model construction synonymous with Bayesian methods, through to advanced topics such as nonparametrics, Gaussian processes and latent factor models. These advanced modelling techniques can easily be applied using computer code samples written in Python and Stan, which are integrated into the main text. Importantly, the reader will learn methods for assessing model fit and for choosing between rival modelling approaches.
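The prior/likelihood construction the notes are built around rests on Bayes' theorem: the posterior is the likelihood times the prior, normalized over the parameter space,

    p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{\int p(y \mid \theta')\, p(\theta')\, d\theta'} \propto p(y \mid \theta)\, p(\theta)

a relation that Bayesian software such as Stan makes computational.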
This work addresses the notion of compression ratios greater than what has been known for random sequential strings in binary and larger radix-based systems as applied to those traditionally found in Kolmogorov complexity. A culmination of the author's decade-long research that began with his discovery of a compressible random sequential string, the book maintains a theoretical-statistical level of introduction suitable for mathematical physicists. It discusses the application of ternary-, quaternary-, and quinary-based systems in statistical communication theory, computing, and physics.
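For context (a standard definition, not quoted from the book): the Kolmogorov complexity of a string x, the benchmark against which such compression ratios are judged, is the length of the shortest program that outputs x on a fixed universal machine U,

    K_U(x) = \min\{\, |p| : U(p) = x \,\}

and a string is called incompressible when no program much shorter than x itself produces it.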
Written with medical statisticians and medical researchers in mind, this intermediate-level reference explores the use of SAS for analyzing medical data. Applied Medical Statistics Using SAS covers the whole range of modern statistical methods used in the analysis of medical data, including regression, analysis of variance and covariance, longitudinal and survival data analysis, missing data, generalized additive models (GAMs), and Bayesian methods. The book focuses on performing these analyses using SAS, the software package of choice for those analysing medical data. Its breadth and depth, coupled with the inclusion of all the SAS code, make this book ideal for practitioners as well as for a graduate class in biostatistics or public health. Complete data sets, all the SAS code, and complete outputs can be found on an associated website: http://support.sas.com/amsus
Data Analytics for the Social Sciences is an introductory, graduate-level treatment of data analytics for social science. It features applications in the R language, arguably the fastest-growing and leading statistical tool for researchers. The book starts with an ethics chapter on the uses and potential abuses of data analytics. Chapters 2 and 3 show how to implement a broad range of statistical procedures in R. Chapters 4 and 5 deal with regression and classification trees and with random forests. Chapter 6 deals with machine learning models and the "caret" package, which makes hundreds of models available to the researcher. Chapter 7 deals with neural network analysis, and Chapter 8 deals with network analysis and visualization of network data. A final chapter treats text analysis, including web scraping, comparative word frequency tables, word clouds, word maps, sentiment analysis, topic analysis, and more. All empirical chapters have two "Quick Start" exercises designed to allow quick immersion in chapter topics, followed by "In Depth" coverage. Data are available for all examples, and runnable R code is provided in a "Command Summary". An appendix provides an extended tutorial on R and RStudio. Almost 30 online supplements accompany the book, offering "books within the book" on a variety of topics, such as agent-based modeling. Rather than focusing on equations, derivations, and proofs, this book emphasizes hands-on generation of output for various social science models and how to interpret that output. It is suitable for advanced undergraduate and graduate students learning statistical data analysis.
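As a flavour of the tree-based and caret material in Chapters 4-6, here is a minimal sketch (not from the book) fitting a cross-validated random forest on R's built-in iris data; it assumes the caret and randomForest packages are installed:

    library(caret)
    set.seed(42)
    fit <- train(Species ~ ., data = iris,
                 method = "rf",
                 trControl = trainControl(method = "cv", number = 5))
    fit          # cross-validated accuracy over the resamples
    varImp(fit)  # which predictors matter most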
Programming Graphical User Interfaces with R introduces each of the major R packages for GUI programming: RGtk2, qtbase, Tcl/Tk, and gWidgets. With examples woven through the text as well as stand-alone demonstrations of simple yet reasonably complete applications, the book features topics especially relevant to statisticians who aim to provide a practical interface to functionality implemented in R. The book offers: A how-to guide for developing GUIs within R The fundamentals for users with limited knowledge of programming within R and other languages GUI design for specific functions or as learning tools The accompanying package, ProgGUIinR, includes the complete code for all examples as well as functions for browsing the examples from the respective chapters. Accessible to seasoned, novice, and occasional R users, this book shows that for many purposes, adding a graphical interface to one's work is not terribly sophisticated or time consuming.
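As a small illustration of the kind of interface the book teaches, here is a minimal sketch (not from the book) using the base tcltk package, R's interface to one of the toolkits it covers: a window containing a button whose callback pops up a message box.

    library(tcltk)
    win <- tktoplevel()
    tkwm.title(win, "Hello")
    btn <- tkbutton(win, text = "Click me",
                    command = function() tkmessageBox(message = "Hello from R!"))
    tkpack(btn, padx = 20, pady = 20)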
Get to grips with Sage One in simple steps. Sage One For Dummies explains every aspect of setting up and navigating Sage One, the newest accounting solution for small businesses and sole traders. It includes clear instructions for using Sage One Accounts, including setting up customer and supplier records, creating invoices, paying customers and suppliers, bank reconciliation, VAT returns and reporting. It also explains how to use the Cashbook function (if your business is more cash-based) and how to work with your accountant using the Accountant Edition. Packed with step-by-step instructions and fully illustrated with screenshots, this book is the easiest way to get the most from Sage One and take control of your business finances.
* Shows readers how to set up, install and navigate using dummy data
* Features setting up customer & supplier records
* Details how to create invoices for customers and suppliers
* Enables the reader to produce their own reports