Helping you quickly get up to speed, THE SHELLY CASHMAN SERIES (R) MICROSOFT (R) OFFICE 365 (R) & EXCEL (R) 2021 COMPREHENSIVE 1st edition, includes the latest on Microsoft 365 and Office 2021 features along with enhanced support for Mac users. Its trademark step-by-step, screen-by-screen, project-based approach enables you to expand your understanding of Office applications through hands-on experimentation and critical thinking. Module Learning Objectives are mapped to Microsoft Office Specialist (MOS) certification objectives, reinforcing the critical skills needed for college and career success. Other Ways boxes reveal alternate click paths to achieve a step, while BTW call-outs offer helpful hints as you work through your projects so you can make the most of Microsoft Office tools. In addition, MindTap and SAM (Skills Assessment Manager) online resources help maximize your study time -- and results.
This book provides quick access to computational tools for algebraic geometry, the mathematical discipline that studies solution sets of polynomial equations. Originating from a number of intensive one-week schools taught by the authors, the text is designed to provide a step-by-step introduction that enables readers to get started with their own computational experiments right away. The authors present the basic concepts and ideas in a compact way.
Enter the world of Overwatch, the smash-hit from Blizzard Entertainment, in this five-story anthology chronicling some of the video game's most fascinating and celebrated characters, now available for the first time in print! Since its initial launch in 2016, Overwatch has captivated the imaginations of over 50 million players worldwide. Now fans can join some of the game's most iconic heroes - and villains - on a series of missions ranging from the lush Caribbean to southern India, and everywhere in between. Authored by some of the most compelling voices in science fiction today, including Michael Chu, Brandon Easton, Christie Golden, and Alyssa Wong, this short story anthology is rife with themes of love and loss, ambition and despair, alliances and conflict, all pointing toward a common hope: that the future is worth fighting for.
This volume features selected contributions on a variety of topics related to linear statistical inference. The peer-reviewed papers from the International Conference on Trends and Perspectives in Linear Statistical Inference (LinStat 2016) held in Istanbul, Turkey, 22-25 August 2016, cover topics in both theoretical and applied statistics, such as linear models, high-dimensional statistics, computational statistics, the design of experiments, and multivariate analysis. The book is intended for statisticians, Ph.D. students, and professionals who are interested in statistical inference.
This book provides insight into the challenges of providing data authentication over wireless communication channels. The authors posit that established standard authentication mechanisms, designed for wired devices, are not sufficient to authenticate data such as voice, images, and video over wireless channels. They propose new mechanisms based on so-called soft authentication algorithms, which tolerate some legitimate modifications in the data they protect: the goal is to remain tolerant to changes in the content while still identifying forgeries. An additional advantage of soft authentication algorithms is the ability to identify the locations of the modifications and, where possible, correct them. The authors show how to achieve this by protecting the data features with the help of error-correcting codes; the correction methods are typically based on watermarking, as discussed in the book. Provides a discussion of data (particularly image) authentication methods in the presence of noise experienced in wireless communication; presents a new class of soft authentication methods, instead of the standard hard authentication methods, used to tolerate minor changes in image data; features authentication methods based on authentication tags as well as digital watermarks.
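The core idea of soft authentication - a tag that survives small legitimate changes but not forgery - can be sketched in a few lines. This toy illustration quantizes features coarsely before hashing; it is not the authors' algorithm, and it omits the error-correcting codes and watermark-based localization the book covers. All names and values are hypothetical:

```python
import hashlib

def soft_tag(samples, step=16):
    """Toy soft-authentication tag: quantize features coarsely,
    then hash the quantized values. Small noise that stays within
    a quantization bin leaves the tag unchanged; larger tampering
    changes the bins and therefore the tag."""
    quantized = bytes(min(255, int(s // step)) for s in samples)
    return hashlib.sha256(quantized).hexdigest()

# Hypothetical 8-sample "image" feature vector.
original = [100, 130, 90, 200, 40, 60, 250, 10]
noisy    = [s + 3 for s in original]      # mild channel noise
tampered = original[:4] + [0, 0, 0, 0]    # forged region

tag = soft_tag(original)
```

A real scheme would additionally encode the quantized features with an error-correcting code so that the locations of modifications can be found and, if small enough, corrected.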
I3E 2009 was held in Nancy, France, during September 23-25, hosted by Nancy University and INRIA Grand-Est at LORIA. The conference provided scientists and practitioners of academia, industry and government with a forum where they presented their latest findings concerning the application of e-business, e-services and e-society, and the underlying technology to support these applications. The 9th IFIP Conference on e-Business, e-Services and e-Society, sponsored by IFIP WG 6.1 of Technical Committee TC6 in cooperation with TC11 and TC8, represents the continuation of previous events held in Zurich (Switzerland) in 2001, Lisbon (Portugal) in 2002, Sao Paulo (Brazil) in 2003, Toulouse (France) in 2004, Poznan (Poland) in 2005, Turku (Finland) in 2006, Wuhan (China) in 2007 and Tokyo (Japan) in 2008. The call for papers attracted papers from 31 countries from the five continents. As a result, the I3E 2009 program offered 12 sessions of full-paper presentations. The 31 selected papers cover a wide and important variety of issues in e-business, e-services and e-society, including security, trust, and privacy, ethical and societal issues, business organization, provision of services as software and software as services, and others. Extended versions of selected papers submitted to I3E 2009 will be published in the International Journal of e-Adoption and in AIS Transactions on Enterprise Systems. In addition, a 500-euro prize was awarded to the authors of the best paper selected by the Program Committee. We thank all authors who submitted their papers, and the Program Committee members and external reviewers for their excellent work.
A comprehensive introduction to various numerical methods used in computational finance today. Quantitative skills are a prerequisite for anyone working in finance, including risk managers, or beginning a career in the field. A thorough grounding in numerical methods is necessary, as is the ability to assess their quality, advantages, and limitations. This book offers a thorough introduction to each method, revealing the numerical traps that practitioners frequently fall into. Each method is referenced with practical, real-world examples in the areas of valuation, risk analysis, and calibration of specific financial instruments and models. It features a strong emphasis on robust schemes for the numerical treatment of problems within computational finance. Methods covered include PDE/PIDE using finite differences or finite elements, fast and stable solvers for sparse grid systems, stabilization and regularization techniques for inverse problems resulting from the calibration of financial models to market data, Monte Carlo and quasi-Monte Carlo techniques for simulating high-dimensional systems, and local and global optimization tools to solve the minimization problem.
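The Monte Carlo technique listed above can be sketched in a few lines of Python. This is a generic illustration, not code from the book; the parameters are a hypothetical at-the-money European call, whose standard Black-Scholes value (about 10.45) serves as a sanity check:

```python
import math
import random

def mc_call_price(S0, K, r, sigma, T, n_paths=200_000, seed=7):
    """Price a European call by Monte Carlo: simulate the terminal
    stock price under geometric Brownian motion and average the
    discounted payoff over many paths."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma**2) * T
    vol = sigma * math.sqrt(T)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        ST = S0 * math.exp(drift + vol * z)   # terminal price
        total += max(ST - K, 0.0)             # call payoff
    return math.exp(-r * T) * total / n_paths

price = mc_call_price(100.0, 100.0, 0.05, 0.2, 1.0)
```

The statistical error of such an estimate shrinks only as one over the square root of the number of paths, which is exactly the kind of numerical trap (slow convergence, variance) the book discusses.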
The world's leading Axapta experts will take you from Axapta novice to pro in this book. This authoritative and comprehensive guide walks you gently through the bulk of what you need to know to productively apply the system in the real world with real data, sizing guidelines, deployment architectures, and code. By the book's end, you will have acquired practical hands-on experience. You'll be able to get Axapta up and running, and identify gaps between the out-of-the-box product and your actual business needs. You'll also know how to automate real-world business functions.
Open source software (free software) has emerged as a major field of scientific inquiry across a number of disciplines. When the concept of open source began to gain mindshare in the global business community, decision makers faced a challenge: to convert hype and potential into sustainable profit and viable business models. This volume addresses this challenge through presenting some of the newest, extensively peer-reviewed research in the area.
This book includes a short history of interactive narrative and an account of a small group collaboratively authored social media narrative: Romeo and Juliet on Facebook: After Love Comes Destruction. At the forefront of narrative innovation are social media channels - speculative spaces for creating and experiencing stories that are interactive and collaborative. Media, however, is only the access point to the expressiveness of narrative content. Wikis, messaging, mash-ups, and social media (Facebook, Twitter, YouTube and others) are on a trajectory of participatory story creation that goes back many centuries. These forms offer authors ways to create narrative meaning that reflects our current media culture, as the harlequinade reflected the culture of the 18th century, and as the volvelle reflected that of the 13th century. Interactivity, Collaboration, and Authoring in Social Media first prospects the last millennium for antecedents of today's authoring practices. It does so with a view to considering how today's digital manifestations are a continuation, perhaps a reiteration, perhaps a novel pioneering, of humans' abiding interest in interactive narrative. The book then takes the reader inside the process of creating a collaborative, interactive narrative in today's social media through an authoring experience undertaken by a group of graduate students. The engaging mix of blogs, emails, personal diaries, and fabricated documents used to create the narrative demonstrates that a social media environment can facilitate a meaningful and productive collaborative authorial experience and result in an abundance of networked, personally expressive, and visually and textually referential content.
The resulting narrative, After Love Comes Destruction, based on Shakespeare's Romeo and Juliet, shows how a generative narrative space evolved around the students' use of social media in ways they had not previously considered, both for authoring and for delivering their final narrative artifact.
This book offers a comprehensive explanation of iterated function systems and how to use them to generate complex objects. The discussion covers the most popular fractal models applied in the field of image synthesis; surveys iterated function system models; and explores algorithms for creating and manipulating fractal objects, techniques for implementing those algorithms, and more. The book includes both descriptive text and pseudo-code samples for the convenience of graphics application programmers.
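The kind of algorithm such a book covers can be illustrated with the classic "chaos game" rendering of an iterated function system. This minimal Python sketch (not the book's own pseudo-code) draws the Sierpinski triangle attractor by repeatedly applying a randomly chosen contraction:

```python
import random

# The three affine contractions of the Sierpinski-triangle IFS:
# each maps a point halfway toward one vertex of the triangle.
VERTICES = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]

def chaos_game(n_points=10_000, seed=1):
    """Approximate the attractor of an iterated function system by
    the 'chaos game': start anywhere, repeatedly pick a random map
    from the IFS and apply it, collecting the visited points."""
    rng = random.Random(seed)
    x, y = 0.3, 0.3
    points = []
    for _ in range(n_points):
        vx, vy = rng.choice(VERTICES)
        x, y = (x + vx) / 2.0, (y + vy) / 2.0
        points.append((x, y))
    return points

pts = chaos_game()
```

Because every map is a contraction, the orbit converges to the attractor regardless of the starting point; plotting `pts` with any 2D scatter tool reveals the fractal.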
Mathematics is undoubtedly the key to state-of-the-art high technology. It is an international technical language and proves to be an eternally young science to those who have learned its ways. Long an indispensable part of research thanks to modeling and simulation, mathematics is enjoying particular vitality now more than ever. Nevertheless, this stormy development is resulting in increasingly high requirements for students in technical disciplines, while general interest in mathematics continues to wane at the same time. This book and its appendices on the Internet seek to deal with this issue, helping students master the difficult transition from the receptive to the productive phase of their education. The author has repeatedly held a three-semester introductory course titled Higher Mathematics at the University of Stuttgart and used a series of "handouts" to show further aspects, make the course contents more motivating, and connect with the mechanics lectures taking place at the same time. One part of the book has more or less evolved from this on its own. True to the original objective, this part treats a variety of separate topics of varying degrees of difficulty; nevertheless, all these topics are oriented to mechanics. Another part of this book seeks to offer a selection of understandable realistic models that can be implemented directly from the multitude of mathematical resources. The author does not attempt to hide his preference for Numerical Mathematics and thus places importance on careful theoretical preparation.
"Power SAS: A Survival Guide" is designed to provide the millions of SAS users with the largest and most comprehensive collection of SAS tips and techniques ever offered. Kirk Lafler is an Internet and software consultant with 25 years of experience providing clients around the world with innovative technical solutions and training. Kirk's tips will help you leverage features of SAS that even the most experienced SAS users may not know. Whether you read it cover to cover, browse through it in your free time, or use it as a reference by looking up pertinent tips, this book is an invaluable self-help resource for working smarter, and for troubleshooting and resolving SAS problems and errors. The book's organization makes it easy for SAS users of all experience levels - programmers, statisticians, database programmers and administrators, technical managers, technical support staff, and students - to find what they need. The nine chapters cover SAS basics, data access, data step programming, data manipulation, data management, data presentation, efficiency and performance, configuration and support, and SAS 9.
This book includes 23 papers dealing with the impact of modern information and communication technologies that support a wide variety of communities: local communities, virtual communities, and communities of practice, such as knowledge communities and scientific communities. The volume is the result of the second multidisciplinary "Communities and Technologies Conference," a major event in this emerging research field. The various chapters discuss how communities are affected by technologies, and how understanding of the way that communities function can be used in improving information systems design. This state of the art overview will be of interest to computer and information scientists, social scientists and practitioners alike.
Computer languages and computer graphics have become the primary modes of human-computer interaction. This book provides a basic introduction to "Real and Virtual Environment" computer modelling. Graphics models are used both to illustrate the way computer languages are processed and to create computer models of graphic displays. Computer languages have been bootstrapped from machine code, to high-level languages such as Java, to animation scripting languages. Integrating graphic and computer models takes this support for programming, design and simulation work one step further, allowing interactive computer graphic displays to be used to construct computer models of both real and virtual environment systems. The Java language is used to implement basic algorithms for language translation and to generate graphic displays. It is also used to simulate the behaviour of a computer system, to explore the way programming and design-simulation environments can be put together.
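The "basic algorithms for language translation" such a book implements (in Java, in its case) typically start with a tokenizer and a recursive-descent parser. Here is an illustrative Python sketch of that technique for arithmetic expressions; the grammar and names are our own, not the book's:

```python
def tokenize(src):
    """Split an arithmetic expression into number and operator tokens."""
    tokens, i = [], 0
    while i < len(src):
        ch = src[i]
        if ch.isspace():
            i += 1
        elif ch in "+-*/()":
            tokens.append(ch)
            i += 1
        else:
            j = i
            while j < len(src) and (src[j].isdigit() or src[j] == "."):
                j += 1
            tokens.append(float(src[i:j]))
            i = j
    return tokens

def parse_expr(tokens, pos=0):
    """expr := term (('+' | '-') term)*  -- one function per rule."""
    value, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] in ("+", "-"):
        op = tokens[pos]
        rhs, pos = parse_term(tokens, pos + 1)
        value = value + rhs if op == "+" else value - rhs
    return value, pos

def parse_term(tokens, pos):
    """term := factor (('*' | '/') factor)*"""
    value, pos = parse_factor(tokens, pos)
    while pos < len(tokens) and tokens[pos] in ("*", "/"):
        op = tokens[pos]
        rhs, pos = parse_factor(tokens, pos + 1)
        value = value * rhs if op == "*" else value / rhs
    return value, pos

def parse_factor(tokens, pos):
    """factor := NUMBER | '(' expr ')'"""
    if tokens[pos] == "(":
        value, pos = parse_expr(tokens, pos + 1)
        return value, pos + 1  # skip the closing ')'
    return tokens[pos], pos + 1

def evaluate(src):
    """Translate and evaluate an expression in one pass."""
    value, _ = parse_expr(tokenize(src))
    return value
```

The same structure, with the evaluation replaced by code emission or scene-graph construction, underlies both language translation and the generation of graphic displays from scripts.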
Celebrate the holidays in Tamriel with this 25-day advent calendar! Inspired by the hit video game series, The Elder Scrolls: The Official Advent Calendar features 25 days of exclusive surprises including keychains, high-quality stickers, recipe cards, mini booklets, and more unique keepsakes. The perfect gift for gamers of all ages, The Elder Scrolls: The Official Advent Calendar brings the epic world of Tamriel to your holiday celebrations!
A "how to" guide for applying statistical methods to biomarker data analysis. Presenting a solid foundation for the statistical methods that are used to analyze biomarker data, Analysis of Biomarker Data: A Practical Guide features preferred techniques for biomarker validation. The authors provide descriptions of select elementary statistical methods that are traditionally used to analyze biomarker data, with a focus on the proper application of each method, including necessary assumptions, software recommendations, and proper interpretation of computer output. In addition, the book discusses frequently encountered challenges in analyzing biomarker data and how to deal with them, methods for the quality assessment of biomarkers, and biomarker study designs. Covering a broad range of statistical methods that have been used to analyze biomarker data in published research studies, Analysis of Biomarker Data: A Practical Guide also features: * A greater emphasis on the application of methods as opposed to the underlying statistical and mathematical theory * The use of SAS(R), R, and other software throughout to illustrate the presented calculations for each example * Numerous exercises based on real-world data as well as solutions to the problems to aid in reader comprehension * The principles of good research study design and the methods for assessing the quality of a newly proposed biomarker * A companion website that includes a software appendix with multiple types of software and complete data sets from the book's examples. Analysis of Biomarker Data: A Practical Guide is an ideal upper-undergraduate and graduate-level textbook for courses in the biological or environmental sciences.
An excellent reference for statisticians who routinely analyze and interpret biomarker data, the book is also useful for researchers who wish to perform their own analyses of biomarker data, such as toxicologists, pharmacologists, epidemiologists, environmental and clinical laboratory scientists, and other professionals in the health and environmental sciences.
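As a small illustration of the elementary biomarker statistics such a guide covers, the sketch below computes sensitivity and specificity at a decision cutoff. The data and cutoff are entirely hypothetical, not taken from the book:

```python
def sens_spec(diseased, healthy, cutoff):
    """Sensitivity and specificity of a biomarker at a cutoff,
    assuming higher marker values indicate disease.
    Sensitivity = true positives / all diseased;
    specificity = true negatives / all healthy."""
    tp = sum(1 for v in diseased if v >= cutoff)
    tn = sum(1 for v in healthy if v < cutoff)
    return tp / len(diseased), tn / len(healthy)

# Hypothetical marker levels in two groups of subjects.
diseased = [8.1, 7.4, 9.0, 6.8, 7.9]
healthy = [5.2, 4.8, 6.1, 5.5, 4.9]
sens, spec = sens_spec(diseased, healthy, 7.0)
```

Sweeping the cutoff over the observed range and plotting sensitivity against one minus specificity yields the ROC curve, a standard biomarker-validation tool.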
This beginner's introduction to MATLAB teaches a sufficient subset of the functionality and gives the reader practical experience on how to find more information. A forty-page appendix contains unique user-friendly summaries and tables of MATLAB functions enabling the reader to find appropriate functions, understand their syntax and get a good overview. The large number of exercises, tips, and solutions mean that the course can be followed with or without a computer. Recent development in MATLAB to advance programming is described using realistic examples in order to prepare students for larger programming projects. Revolutionary step by step 'guided tour' eliminates the steep learning curve encountered in learning new programming languages. Each chapter corresponds to an actual engineering course, where examples in MATLAB illustrate the typical theory, providing a practical understanding of these courses. Complementary homepage contains exercises, a take-home examination, and an automatic marking that grades the solution. End of chapter exercises with selected solutions in an appendix. The development of MATLAB programming and the rapid increase in the use of MATLAB in engineering courses makes this a valuable self-study guide for both engineering students and practising engineers. Readers will find that this time-less material can be used throughout their education and into their career.
Cognitive Intelligence with Neutrosophic Statistics in Bioinformatics investigates and presents the many applications that have arisen in the last ten years using neutrosophic statistics in bioinformatics, medicine, agriculture and cognitive science. This book will be very useful to the scientific community, appealing to audiences interested in fuzzy, vague concepts from which uncertain data are collected, including academic researchers, practicing engineers and graduate students. Neutrosophic statistics is a generalization of classical statistics. In classical statistics, the data is known, formed by crisp numbers. In comparison, data in neutrosophic statistics has some indeterminacy. This data may be ambiguous, vague, imprecise, incomplete, and even unknown. Neutrosophic statistics refers to a set of data, such that the data or a part of it are indeterminate in some degree, and to methods used to analyze the data.
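One common way to make the "indeterminate data" idea concrete is to record each observation as an interval and apply classical formulas to the bounds. The sketch below is a minimal illustration of that interval view (data and function names are ours, not the book's):

```python
def neutrosophic_mean(intervals):
    """Mean of interval-valued (indeterminate) observations: apply
    the classical mean to the lower and upper bounds separately,
    yielding an interval that contains every mean compatible with
    the indeterminacy in the data."""
    lows = [lo for lo, hi in intervals]
    highs = [hi for lo, hi in intervals]
    n = len(intervals)
    return (sum(lows) / n, sum(highs) / n)

# Hypothetical lab measurements: crisp readings have lo == hi,
# indeterminate readings are only known to lie in a range.
data = [(4.0, 4.0), (5.0, 5.5), (3.8, 4.2), (6.0, 6.0)]
mean_lo, mean_hi = neutrosophic_mean(data)
```

When every interval collapses to a single number, the result collapses to the classical sample mean, matching the statement that neutrosophic statistics generalizes classical statistics.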
Given the explosion of interest in mathematical methods for solving problems in finance and trading, a great deal of research and development is taking place in universities, large brokerage firms, and in the supporting trading software industry. Mathematical advances have been made both analytically and numerically in finding practical solutions. This book provides a comprehensive overview of existing and original material about what mathematics, when allied with Mathematica, can do for finance. Sophisticated theories are presented systematically in a user-friendly style, with a powerful combination of mathematical rigor and Mathematica programming. Three kinds of solution methods are emphasized: symbolic, numerical, and Monte Carlo. Nowadays, only good personal computers are required to handle the symbolic and numerical methods that are developed in this book. Key features: * No previous knowledge of Mathematica programming is required * The symbolic, numeric, data management and graphic capabilities of Mathematica are fully utilized * Monte Carlo solutions of scalar and multivariable SDEs are developed and utilized heavily in discussing trading issues such as Black-Scholes hedging * Black-Scholes and Dupire PDEs are solved symbolically and numerically * Fast numerical solutions to free boundary problems with details of their Mathematica realizations are provided * A comprehensive study of optimal portfolio diversification, including an original theory of optimal portfolio hedging under non-log-normal asset price dynamics, is presented. The book is designed for the academic community of instructors and students, and most importantly, will meet the everyday trading needs of quantitatively inclined professional and individual investors.
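The Black-Scholes pricing mentioned in the feature list has a well-known closed form. The book itself works in Mathematica; purely as an illustration, here is the standard formula in Python, using the error function for the normal CDF:

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """Closed-form Black-Scholes price of a European call:
    C = S * N(d1) - K * exp(-rT) * N(d2)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Hypothetical at-the-money call: spot 100, strike 100,
# rate 5%, volatility 20%, one year to expiry.
price = black_scholes_call(100.0, 100.0, 0.05, 0.2, 1.0)
```

The hedge ratio (delta) used in Black-Scholes hedging is N(d1) from the same computation, which is why symbolic and numerical treatments of this formula sit at the center of such a book.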
Recent pursuits in the realm of big data processing, interpretation, collection and organization have emerged across numerous sectors, including business, industry and government organizations. Data sets such as customer transactions for a mega-retailer, weather monitoring, and intelligence gathering quickly outpace the capacities of traditional techniques and tools of data analysis. The 3V (volume, variability and velocity) challenges led to the emergence of new techniques and tools for data visualization, acquisition, and serialization. Soft Computing, regarded as a plethora of technologies of fuzzy sets (or Granular Computing), neurocomputing and evolutionary optimization, brings forward a number of unique features that might be instrumental to the development of concepts and algorithms for dealing with big data. This carefully edited volume provides the reader with updated, in-depth material on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of big data architectures, analysis, and interpretation, as well as data analytics. The book is aimed at a broad audience of researchers and practitioners, including those active in various disciplines in which big data, their analysis and optimization are of genuine relevance. One focal point is the systematic exposure of the concepts, design methodology, and detailed algorithms. In general, the volume adheres to a top-down strategy, starting with the concepts and motivation and then proceeding with the detailed design that materializes in specific algorithms and representative applications.
The material is self-contained and provides the reader with all necessary prerequisites, and augments some parts with a step-by-step explanation of more advanced concepts, supported by a significant amount of illustrative numeric material and some application scenarios to motivate the reader and make some abstract concepts more tangible.
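The fuzzy-set technology mentioned above can be made concrete with a triangular membership function, the simplest building block of fuzzy and granular models. This is a generic sketch with an illustrative linguistic term, not an example from the volume:

```python
def triangular(a, b, c):
    """Build a triangular fuzzy membership function with feet at
    a and c and peak at b: membership rises linearly on [a, b]
    and falls linearly on [b, c], and is zero outside [a, c]."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# Hypothetical linguistic term "high velocity" for a data stream,
# measured in thousands of events per second.
high = triangular(50.0, 100.0, 150.0)
```

Granular models combine such membership functions with rules ("if velocity is high and volume is large, then ..."), which is where soft computing connects to the 3V challenges of big data.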
Correcting the Great Mistake. People often mistake one thing for another. That's human nature. However, one would expect the leaders in a particular field of endeavour to have superior abilities to discriminate among the developments within that field. That is why it is so perplexing that the technology elite - supposedly savvy folk such as software developers, marketers and businessmen - have continually mistaken Web-based graphics for something it is not. The first great graphics technology for the Web, VRML, has been mistaken for something else since its inception. Viewed variously as a game system, a format for architectural walkthroughs, a platform for multi-user chat and an augmentation of reality, VRML may qualify as the least understood invention in the history of information technology. Perhaps it is so because when VRML was originally introduced it was touted as a tool for putting the shopping malls of the world online, at once prosaic and horrifyingly mundane to those of us who were developing it. Perhaps those first two initials, "VR," created expectations of sprawling, photorealistic fantasy landscapes for exploration and play across the Web. Or perhaps the magnitude of the invention was simply too great to be understood at the time by the many, ironically even by those spending the money to underwrite its development. Regardless of the reasons, VRML suffered in the mainstream as it was twisted to meet unintended ends and stretched far beyond its limitations.