Microsoft Excel: the simplest and quickest guide to operating Excel's complex system! Need to learn Excel quickly and easily? Want to know the power of Excel spreadsheets? Tips and secrets of the Microsoft giant's program, revealed! Shortcuts, quick entries, and everything else that makes Excel an easy application. Want to skip an entire semester or course spent learning Excel? Everything from navigation and performing commands to formatting! Want a better understanding of Excel's power and the functionality of its formulas? How about charts or complex spreadsheets? One click and it all gets answered, so purchase now!
Provides a synopsis of the various technologies in perceptual-based multimedia design.
The methods considered in the 7th conference on "Finite Volumes for Complex Applications" (Berlin, June 2014) have properties which offer distinct advantages for a number of applications. The second volume of the proceedings covers reviewed contributions reporting successful applications in the fields of fluid dynamics, magnetohydrodynamics, structural analysis, nuclear physics, semiconductor theory and other topics. The finite volume method in its various forms is a space discretization technique for partial differential equations based on the fundamental physical principle of conservation. Recent decades have brought significant success in the theoretical understanding of the method. Many finite volume methods preserve further qualitative or asymptotic properties, including maximum principles, dissipativity, monotone decay of free energy, and asymptotic stability. Due to these properties, finite volume methods belong to the wider class of compatible discretization methods, which preserve qualitative properties of continuous problems at the discrete level. This structural approach to the discretization of partial differential equations becomes particularly important for multiphysics and multiscale applications. Researchers, PhD and masters level students in numerical analysis, scientific computing and related fields such as partial differential equations will find this volume useful, as will engineers working in numerical modeling and simulations.
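To make the conservation principle behind finite volume methods concrete, here is a minimal sketch (not taken from the proceedings) of a first-order finite volume scheme for the 1D linear advection equation u_t + a u_x = 0 with periodic boundaries; the grid size, velocity, and upwind flux choice are illustrative assumptions.

```python
import numpy as np

# Minimal 1D finite volume scheme for u_t + a*u_x = 0 (illustrative sketch only).
# Cell averages are updated from fluxes through cell faces, so the total
# "mass" sum(u)*dx is conserved up to round-off -- the defining property
# of finite volume discretizations.

a = 1.0                       # constant advection speed (assumed)
nx = 100
dx = 1.0 / nx                 # cell width (assumed unit domain)
dt = 0.4 * dx / a             # time step satisfying the CFL condition
x = (np.arange(nx) + 0.5) * dx
u = np.exp(-200.0 * (x - 0.3) ** 2)   # initial cell averages: a smooth bump

mass0 = u.sum() * dx
for _ in range(200):
    F = a * u                              # upwind flux through each cell's right face
    u = u - dt / dx * (F - np.roll(F, 1))  # update from the flux difference across each cell

mass = u.sum() * dx
print(f"initial mass {mass0:.6f}, final mass {mass:.6f}")  # equal up to round-off
```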
This book is designed as a gentle introduction to the fascinating field of choice modeling and its practical implementation using the R language. Discrete choice analysis is a family of methods useful to study individual decision-making. With strong theoretical foundations in consumer behavior, discrete choice models are used in the analysis of health policy, transportation systems, marketing, economics, public policy, political science, urban planning, and criminology, to mention just a few fields of application. The book does not assume prior knowledge of discrete choice analysis or R, but instead strives to introduce both in an intuitive way, starting from simple concepts and progressing to more sophisticated ideas. Loaded with a wealth of examples and code, the book covers the fundamentals of data and analysis in a progressive way. Readers begin with simple data operations and the underlying theory of choice analysis and conclude by working with sophisticated models including latent class logit models, mixed logit models, and ordinal logit models with taste heterogeneity. Data visualization is emphasized to explore both the input data as well as the results of models. This book should be of interest to graduate students, faculty, and researchers conducting empirical work using individual level choice data who are approaching the field of discrete choice analysis for the first time. In addition, it should interest more advanced modelers wishing to learn about the potential of R for discrete choice analysis. By embedding the treatment of choice modeling within the R ecosystem, readers benefit from learning about the larger R family of packages for data exploration, analysis, and visualization.
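For a flavor of what such models compute, the sketch below (not code from the book, and in Python rather than R) evaluates multinomial logit choice probabilities for a single decision maker; the alternatives, attributes, and taste coefficients are invented for illustration.

```python
import numpy as np

# Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j), where V_i is the
# systematic utility of alternative i. All numbers below are illustrative.

# Attributes of three travel alternatives: [cost, travel time in minutes]
X = np.array([
    [2.0, 45.0],   # bus
    [4.0, 30.0],   # train
    [6.0, 20.0],   # car
])
beta = np.array([-0.4, -0.05])   # assumed taste coefficients for cost and time

V = X @ beta                  # systematic utilities
P = np.exp(V - V.max())       # subtract the max for numerical stability
P /= P.sum()

for name, p in zip(["bus", "train", "car"], P):
    print(f"P({name}) = {p:.3f}")
```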
The papers in this volume represent the most timely and advanced contributions to the 2014 Joint Applied Statistics Symposium of the International Chinese Statistical Association (ICSA) and the Korean International Statistical Society (KISS), held in Portland, Oregon. The contributions cover new developments in statistical modeling and clinical research, including model development, model checking, and innovative clinical trial design and analysis. Each paper was peer-reviewed by at least two referees and by an editor. The conference was attended by over 400 participants from academia, industry, and government agencies around the world, including from North America, Asia, and Europe. It offered 3 keynote speeches, 7 short courses, 76 parallel scientific sessions, student paper sessions, and social events.
Over the past decade the field of Human-Computer Interaction has evolved from the study of the usability of interactive products towards a more holistic understanding of how they may mediate desired human experiences. This book identifies the notion of diversity in users' experiences with interactive products and proposes methods and tools for modeling this along two levels: (a) interpersonal diversity in users' responses to early conceptual designs, and (b) the dynamics of users' experiences over time. The Repertory Grid Technique is proposed as an alternative to standardized psychometric scales for modeling interpersonal diversity in users' responses to early concepts in the design process, and new Multi-Dimensional Scaling procedures are introduced for modeling such complex quantitative data. iScale, a tool for the retrospective assessment of users' experiences over time, is proposed as an alternative to longitudinal field studies, and a semi-automated technique for the analysis of the elicited experience narratives is introduced. Through these two methodological contributions, this book argues against averaging in the subjective evaluation of interactive products. It proposes the development of interactive tools that can assist designers in moving across multiple levels of abstraction of empirical data, as design-relevant knowledge might be found on all these levels. Foreword by Jean-Bernard Martens and Closing Note by Marc Hassenzahl.
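As one way to picture what such scaling does (a generic sketch, not the book's own procedure), metric Multi-Dimensional Scaling can embed a precomputed dissimilarity matrix between early design concepts into two dimensions; the matrix below is invented for illustration.

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical pairwise dissimilarities between four early design concepts,
# e.g. aggregated from repertory grid ratings (all values are made up).
D = np.array([
    [0.0, 1.2, 3.1, 2.7],
    [1.2, 0.0, 2.8, 2.5],
    [3.1, 2.8, 0.0, 0.9],
    [2.7, 2.5, 0.9, 0.0],
])

# Metric MDS on the precomputed dissimilarities; two output dimensions
# so the concepts can be plotted and compared visually.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)
print(coords)   # one 2-D point per design concept
```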
Organizations and enterprises are becoming more competitive about enlarging their target markets and developing innovative products and services, all the while transforming their business models to better fit the new technological era. Enterprise Interoperability (EI) is an emerging and thriving research domain, but a lack of standardized guidelines inhibits its complete reuse. Revolutionizing Enterprise Interoperability through Scientific Foundations presents the latest advancements and research in Enterprise Interoperability, as well as core concepts, theories, and future directions. This book is an essential resource for researchers and practitioners in the Enterprise Interoperability field and related areas.
The "Handbook of Simulation Optimization" presents an overview of the state of the art of simulation optimization, providing a survey of the most well-established approaches for optimizing stochastic simulation models and a sampling of recent research advances in theory and methodology. Leading contributors cover such topics as discrete optimization via simulation, ranking and selection, efficient simulation budget allocation, random search methods, response surface methodology, stochastic gradient estimation, stochastic approximation, sample average approximation, stochastic constraints, variance reduction techniques, model-based stochastic search methods and Markov decision processes. This single volume should serve as a reference for those already in the field and as a means for those new to the field for understanding and applying the main approaches. The intended audience includes researchers, practitioners and graduate students in the business/engineering fields of operations research, management science, operations management and stochastic control, as well as in economics/finance and computer science.
Eye tracking is a widely used research method, but there are many questions and misconceptions about how to effectively apply it. Eye Tracking the User Experience--the first how-to book about eye tracking for UX practitioners--offers step-by-step advice on how to plan, prepare, and conduct eye tracking studies; how to analyze and interpret eye movement data; and how to successfully communicate eye tracking findings.
This book constitutes the refereed post-proceedings of the 7th IFIP WG 5.5 International Precision Assembly Seminar, IPAS 2014, held in Chamonix, France, in February 2014. The 20 revised full papers were carefully reviewed and selected from numerous submissions. The papers cover the following topics: micro-assembly processes and systems ranging from desktop factory automation and packaging of MEMS to self-assembly processes and platforms; handling and manipulation, including flexible gripper systems, fixturing and high precision actuators; tolerance management and error-compensation techniques applied at different scales of precision assembly; metrology and quality control; intelligent assembly control; process selection, modelling and planning.
The book presents a snapshot of the state of the art in the field of turbulence modeling and covers the latest developments concerning direct numerical simulations, large eddy simulations, compressible turbulence, coherent structures, two-phase flow simulation and other related topics. It provides readers with a comprehensive review of both theory and applications, describing in detail the authors' own experimental results. The book is based on the proceedings of the third Turbulence and Interactions Conference (TI 2012), which was held on June 11-14 in La Saline-les-Bains, La Reunion, France, and includes both keynote lectures and outstanding contributed papers presented at the conference. This multifaceted collection, which reflects the conference's emphasis on the interplay of theory, experiments and computing in the process of understanding and predicting the physics of complex flows and solving related engineering problems, offers a practice-oriented guide for students, researchers and professionals in the field of computational fluid dynamics, turbulence modeling and related areas.
"This volume provides essential guidance for transforming
mathematics learning in schools through the use of innovative
technology, pedagogy, and curriculum. It presents clear, rigorous
evidence of the impact technology can have in improving students
learning of important yet complex mathematical concepts -- and goes
beyond a focus on technology alone to clearly explain how teacher
professional development, pedagogy, curriculum, and student
participation and identity each play an essential role in
transforming mathematics classrooms with technology. Further,
evidence of effectiveness is complemented by insightful case
studies of how key factors lead to enhancing learning, including
the contributions of design research, classroom discourse, and
meaningful assessment. "* Engaging students in deeply learning the important concepts
in mathematics "* Engaging students in deeply learning the important concepts
in mathematics
The first comprehensive guide to explore the growing field of electronic information, The Text in the Machine: Electronic Texts in the Humanities will help you create and use electronic texts. This book explains the processes involved in developing computerized books on library Web sites, CD-ROMs, or your own Web site. With the information provided by The Text in the Machine, you'll be able to successfully transfer written words to a digitized form and increase access to any kind of information. Keeping the perspectives of scholars, students, librarians, users, and publishers in mind, this book outlines the necessary steps for electronic conversion in a comprehensive manner. The Text in the Machine addresses many variables that need to be taken into consideration to help you digitize texts, such as:
* defining types of markup, markup systems, and their uses
* identifying characteristics of the written text, such as its linguistic and physical nature, before choosing a markup scheme
* ensuring accuracy in electronic texts by keying in information up to three times and choosing software that is compatible with the markup systems you are using
* examining the best file formats for scanning written texts and converting them to digital form
* explaining the delivery systems available for electronic texts, such as CD-ROMs, the Internet, magnetic tape, and the variety of software that will interpret these interfaces
* designing the structure of electronic texts with linear presentation, segmented text, or image files to increase readability and accessibility
Containing lists of suggested readings and examples of electronic text Web sites, this book provides you with the opportunity to see how other libraries and scholars are creating and publishing digital texts. From The Text in the Machine, you'll receive the knowledge to make this medium of information accessible and beneficial to patrons and scholars around the world.
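To illustrate what a text marked up for electronic delivery can look like (a generic, hypothetical example rather than any scheme prescribed by the book), the sketch below builds a small TEI-flavored XML fragment in Python.

```python
import xml.etree.ElementTree as ET

# Build a minimal, TEI-flavored markup of a short passage. The element names
# and the sample sentence are illustrative, not a prescribed scheme.
text = ET.Element("text")
body = ET.SubElement(text, "body")
div = ET.SubElement(body, "div", attrib={"type": "chapter", "n": "1"})
ET.SubElement(div, "head").text = "Chapter One"
ET.SubElement(div, "p").text = "It was the best of times, it was the worst of times."

print(ET.tostring(text, encoding="unicode"))
```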
This thesis provides a systematic and integral answer to an open problem concerning the universality of dynamic fuzzy controllers. It presents a number of novel ideas and approaches to various issues including universal function approximation, universal fuzzy models, universal fuzzy stabilization controllers, and universal fuzzy integral sliding mode controllers. The proposed control design criteria can be conveniently verified using the MATLAB toolbox. Moreover, the thesis provides a new, easy-to-use form of fuzzy variable structure control. Emphasis is given to the point that, in the context of deterministic/stochastic systems in general, the authors are in fact discussing non-affine nonlinear systems using a class of generalized T-S fuzzy models, which offer considerable potential in a wide range of applications.
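To give a concrete sense of the model class (a toy sketch, not the thesis's design procedure; the membership functions and local models are invented), a Takagi-Sugeno fuzzy model blends local linear models using rule firing strengths:

```python
import numpy as np

# Toy Takagi-Sugeno (T-S) fuzzy model with two rules over a scalar input x:
#   Rule 1: IF x is "small" THEN y = 0.5*x + 1.0
#   Rule 2: IF x is "large" THEN y = 2.0*x - 0.5
# The global output is the firing-strength-weighted average of the local models.
# All membership functions and local gains are illustrative assumptions.

def membership_small(x):
    return np.exp(-((x - 0.0) ** 2) / 2.0)      # Gaussian membership centered at 0

def membership_large(x):
    return np.exp(-((x - 4.0) ** 2) / 2.0)      # Gaussian membership centered at 4

def ts_output(x):
    w = np.array([membership_small(x), membership_large(x)])   # rule firing strengths
    local = np.array([0.5 * x + 1.0, 2.0 * x - 0.5])            # local linear models
    return float(w @ local / w.sum())                           # weighted blend

for x in [0.0, 2.0, 4.0]:
    print(f"x = {x:.1f} -> y = {ts_output(x):.3f}")
```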
This Festschrift in honour of Ursula Gather's 60th birthday deals with modern topics in the field of robust statistical methods, especially for time series and regression analysis, and with statistical methods for complex data structures. The individual contributions of leading experts provide a textbook-style overview of the topic, supplemented by current research results and questions. The statistical theory and methods in this volume aim at the analysis of data which deviate from classical stringent model assumptions, which contain outlying values and/or have a complex structure. It is written for researchers as well as master's and PhD students with a good knowledge of statistics.
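As a small illustration of why robustness matters when data contain outlying values (a generic example, not drawn from the Festschrift), the sketch below compares the sample mean with the median and a MAD-based scale estimate on data containing one gross outlier.

```python
import numpy as np

# One gross outlier is enough to drag the sample mean and standard deviation
# far from the bulk of the data, while the median and the MAD-based scale
# estimate stay stable. The data values are made up for illustration.
x = np.array([9.8, 10.1, 9.9, 10.2, 10.0, 9.7, 10.3, 95.0])

mean, std = x.mean(), x.std(ddof=1)
median = np.median(x)
mad = 1.4826 * np.median(np.abs(x - median))   # consistent with sigma under normality

print(f"mean   = {mean:6.2f}, sample std = {std:6.2f}")
print(f"median = {median:6.2f}, MAD scale  = {mad:6.2f}")
```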
Just as pilots and doctors improve by studying crash reports and postmortems, experience designers can improve by learning how customer experience failures cause products to fail in the marketplace. Rather than proselytizing a particular approach to design, Why We Fail holistically explores what teams actually built, why the products failed, and how we can learn from the past to avoid failure ourselves.
Imagine how much easier creating web and mobile applications would be if you had a practical and concise, hands-on guide to visual design. "Visual Usability" gets into the nitty-gritty of applying visual design principles to complex application design. You'll learn how to avoid common mistakes, make informed decisions about application design, and elevate the ordinary. We'll review three key principles that affect application design - consistency, hierarchy, and personality - and illustrate how to apply tools like typography, color, and layout to digital application design. Whether you're a UI professional looking to fine-tune your skills, a developer who cares about making applications beautiful and usable, or someone entirely new to the design arena, Visual Usability is your one-stop, practical guide to visual design.
Written to bridge the information needs of management and computational scientists, this book presents the first comprehensive treatment of Computational Red Teaming (CRT). The author describes an analytics environment that blends human reasoning and computational modeling to design risk-aware and evidence-based smart decision-making systems. He presents the Shadow CRT Machine, which shadows the operations of an actual system to think with decision makers, challenge threats, and design remedies. This is the first book to generalize red teaming (RT) outside the military and security domains, and it offers coverage of RT principles along with practical and ethical guidelines. The author follows Gilbert's principles for introducing a science: simplicity, in that the book adopts a special style to make it accessible to a wide range of readers; coherence, in that only the necessary elements from experimentation, optimization, simulation, data mining, big data, cognitive information processing, and systems thinking are blended together systematically to present CRT as the science of Risk Analytics and Challenge Analytics; and utility, in that the author draws on a wide range of examples, from job interviews to cyber operations, before presenting three case studies from air traffic control technologies, human behavior, and complex socio-technical systems involving real-time mining and integration of human brain data in the decision-making environment.
You may like...
Database Systems - Design…
Carlos Coronel, Steven Morris
Paperback
Data Communication and Computer Networks…
Jill West, Curt M. White
Paperback
Financial Analysis With Microsoft Excel
Timothy Mayes
Paperback
Modelling and Control in Biomedical…
David Dagan Feng, Janan Zaytoon
Paperback