Welcome to Loot.co.za!
Proceedings of the 2012 International Conference on Information Technology and Software Engineering presents selected articles from this major event, held in Beijing on December 8-10, 2012. The book presents the latest research trends, methods and experimental results in information technology and software engineering, covering a range of state-of-the-art research theories and approaches. Subjects span intelligent computing, information processing, software engineering, the Web, the Unified Modeling Language (UML), multimedia, communication technologies, system identification, and graphics and visualization. The proceedings provide a major interdisciplinary forum for researchers and engineers to present the most innovative studies and advances, and can serve as an excellent reference for researchers and graduate students working on information technology and software engineering. Prof. Wei Lu, Dr. Guoqiang Cai, Prof. Weibin Liu and Dr. Weiwei Xing all work at Beijing Jiaotong University.
This book provides resource materials for teachers to use in an introductory or intermediate statistics class. The chapter content follows the order of many popular statistics books, so it is easy to supplement the content and exercises with class lecture materials. The book contains R script programs that demonstrate important topics and concepts covered in a statistics course, including probability, random sampling, population distribution types, the role of the Central Limit Theorem, the creation of sampling distributions for statistics, and more. Each chapter contains true/false quizzes to test basic knowledge of the topics covered, along with numerous exercises whose answers or solutions are provided; the exercises reinforce understanding of the statistical concepts presented. An instructor can select any of the supplemental materials to enhance lectures or provide additional coverage of concepts and topics in their statistics book.
The book uses the R statistical package, which contains an extensive library of functions. R is free and easily downloaded and installed, and the R programs are run in RStudio, a graphical user interface that makes accessing R programs, viewing output from the exercises, and managing graphical displays easier. The first chapter covers the fundamentals of the R statistical package: installing R and RStudio, accessing R packages and libraries of functions, finding manuals and technical documentation, and the basic R commands used in the script programs in later chapters. This chapter is important for the instructor to master so that the software can be installed and the R scripts run. Because R is free, teachers and students can also install the software and run the R scripts on university computers, at home, or on laptops, making it more accessible than many commercial software packages.
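The sampling-distribution idea described above can be illustrated outside R as well. The following Python sketch (with hypothetical values, not taken from the book's R scripts) draws repeated samples from a skewed exponential population and shows that the sample means cluster around the population mean with spread close to sigma/sqrt(n), as the Central Limit Theorem predicts.

```python
import random
import statistics

# Population: exponential(1), a skewed distribution with mean 1 and sd 1.
random.seed(42)
population_mean = 1.0

def sample_mean(n):
    """Mean of n independent draws from the exponential(1) population."""
    return statistics.fmean(random.expovariate(1.0) for _ in range(n))

# Build the sampling distribution of the mean for samples of size 30.
n, reps = 30, 2000
means = [sample_mean(n) for _ in range(reps)]

center = statistics.fmean(means)   # close to the population mean (1.0)
spread = statistics.stdev(means)   # close to sigma / sqrt(n) ~= 0.18
```

Even though the population is far from normal, the distribution of `means` is approximately normal and much narrower than the population itself.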
It is said that business re-engineering is part of our transition to a post-industrial society. The purpose of this book is to present an approach to reorganizing businesses using the discipline of software engineering as a guiding paradigm. The author's thesis is that software engineering provides the necessary analytical expertise for defining business processes and the tools to transform process descriptions into support systems.
Text classification is becoming a crucial task for analysts in many areas. In the last few decades, the production of textual documents in digital form has increased exponentially, with applications ranging from web pages to scientific documents, including emails, news and books. Despite the widespread use of digital texts, handling them is inherently difficult: the large amount of data necessary to represent them and the subjectivity of classification complicate matters. This book gives a concise view of how to use kernel approaches for inductive inference in large-scale text classification, presenting a series of new techniques to enhance, scale and distribute text classification tasks. It is not intended as a comprehensive survey of the state of the art of the whole field of text classification. Its purpose is less ambitious and more practical: to explain and illustrate some of the important methods used in this field, in particular kernel approaches and techniques.
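As a toy illustration of the kernel idea (not code from the book), the snippet below defines a linear kernel on sparse bag-of-words vectors and uses the induced cosine similarity for nearest-neighbour classification of short texts; the corpus and labels are invented for the example.

```python
from collections import Counter
import math

def bow(text):
    """Bag-of-words representation: a Counter of lowercase tokens."""
    return Counter(text.lower().split())

def linear_kernel(a, b):
    """k(x, y) = <x, y>, computed directly on sparse Counters."""
    return sum(a[t] * b[t] for t in a if t in b)

def cosine_kernel(a, b):
    """Normalised kernel: k(x, y) / sqrt(k(x, x) * k(y, y))."""
    na = math.sqrt(linear_kernel(a, a))
    nb = math.sqrt(linear_kernel(b, b))
    return linear_kernel(a, b) / (na * nb) if na and nb else 0.0

# Tiny labelled corpus (hypothetical example data).
train = [("the stock market rallied today", "finance"),
         ("central bank raises interest rates", "finance"),
         ("the team won the championship game", "sports"),
         ("striker scores twice in the final", "sports")]

def classify(text):
    """1-nearest-neighbour classification in kernel-induced similarity."""
    doc = bow(text)
    return max(train, key=lambda lt: cosine_kernel(doc, bow(lt[0])))[1]

print(classify("bank rates rise again"))  # -> finance
```

Kernel methods such as support vector machines build on exactly this ingredient: a similarity function between documents that never requires materialising an explicit, dense feature space.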
Monte Carlo statistical methods, particularly those based on Markov chains, are now an essential component of the standard set of techniques used by statisticians. This new edition has been revised towards a coherent and flowing coverage of these simulation techniques, incorporating the most recent developments in the field. In particular, the introductory coverage of random variable generation has been totally revised, with many concepts unified through a fundamental theorem of simulation. There are five completely new chapters that cover Monte Carlo control, reversible jump, slice sampling, sequential Monte Carlo, and perfect sampling. Gibbs sampling now receives more in-depth coverage across three consecutive chapters: the development starts with slice sampling and its connection with the fundamental theorem of simulation, and builds up to two-stage Gibbs sampling and its theoretical properties; a third chapter covers the multi-stage Gibbs sampler and its variety of applications. Lastly, chapters from the previous edition have been revised for easier access, with the examples given more detailed coverage. This textbook is intended for a second-year graduate course, but will also be useful to anyone who either wants to apply simulation techniques to practical problems or wishes to grasp the fundamental principles behind those methods. The authors do not assume familiarity with Monte Carlo techniques (such as random variable generation), with computer programming, or with any Markov chain theory (the necessary concepts are developed in Chapter 6). A solutions manual, which covers approximately 40% of the problems, is available for instructors who require the book for a course. Christian P. Robert is Professor of Statistics in the Applied Mathematics Department at Université Paris Dauphine, France.
He is also Head of the Statistics Laboratory at the Center for Research in Economics and Statistics (CREST) of the National Institute for Statistics and Economic Studies (INSEE) in Paris, and Adjunct Professor at Ecole Polytechnique. He has written three other books, including The Bayesian Choice, Second Edition, Springer 2001. He also edited Discretization and MCMC Convergence Assessment, Springer 1998. He has served as associate editor for the Annals of Statistics and the Journal of the American Statistical Association. He is a fellow of the Institute of Mathematical Statistics, and a winner of the Young Statistician Award of the Société de Statistique de Paris in 1995. George Casella is Distinguished Professor and Chair, Department of Statistics, University of Florida. He has served as the Theory and Methods Editor of the Journal of the American Statistical Association and Executive Editor of Statistical Science. He has authored three other textbooks: Statistical Inference, Second Edition, 2001, with Roger L. Berger; Theory of Point Estimation, 1998, with Erich Lehmann; and Variance Components, 1992, with Shayle R. Searle and Charles E. McCulloch. He is a fellow of the Institute of Mathematical Statistics and the American Statistical Association, and an elected fellow of the International Statistical Institute.
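The two-stage Gibbs sampler described in the blurb alternates draws from the two full conditional distributions. A minimal Python sketch for a bivariate normal target with correlation rho (illustrative parameters, not an example from the book):

```python
import random
import statistics

random.seed(0)
rho = 0.8                      # target: standard bivariate normal, correlation rho
sd = (1 - rho**2) ** 0.5       # conditional standard deviation

def gibbs(n_iter, burn=500):
    """Two-stage Gibbs sampler alternating the two full conditionals."""
    x = y = 0.0
    draws = []
    for i in range(n_iter + burn):
        x = random.gauss(rho * y, sd)   # x | y ~ N(rho*y, 1 - rho^2)
        y = random.gauss(rho * x, sd)   # y | x ~ N(rho*x, 1 - rho^2)
        if i >= burn:
            draws.append((x, y))
    return draws

draws = gibbs(20000)
xs = [d[0] for d in draws]
x_mean = statistics.fmean(xs)   # marginal of x should be ~ N(0, 1)
x_sd = statistics.stdev(xs)
```

After burn-in, the chain's draws have the correct marginal distribution even though each coordinate is only ever sampled conditionally on the other.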
Collecting the work of the foremost scientists in the field, Discrete-Event Modeling and Simulation: Theory and Applications presents the state of the art in modeling discrete-event systems using the discrete-event system specification (DEVS) approach. It introduces the latest advances, recent extensions of formal techniques, and real-world examples of various applications. The book covers many topics that pertain to several layers of the modeling and simulation architecture. It discusses DEVS model development support and the interaction of DEVS with other methodologies. It describes different forms of simulation supported by DEVS, the use of real-time DEVS simulation, the relationship between DEVS and graph transformation, the influence of DEVS variants on simulation performance, and interoperability and composability with emphasis on DEVS standardization. The text also examines extensions to DEVS, new formalisms, and abstractions of DEVS models as well as the theory and analysis behind real-world system identification and control. To support the generation and search of optimal models of a system, a framework is developed based on the system entity structure and its transformation to DEVS simulation models. In addition, the book explores numerous interesting examples that illustrate the use of DEVS to build successful applications, including optical network-on-chip, construction/building design, process control, workflow systems, and environmental models. A one-stop resource on advances in DEVS theory, applications, and methodology, this volume offers a sampling of the best research in the area, a broad picture of the DEVS landscape, and trend-setting applications enabled by the DEVS approach. It provides the basis for future research discoveries and encourages the development of new applications.
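At its core, a DEVS atomic model is specified by a time-advance function, an output function, and transition functions. The following Python sketch is a simplified, illustrative reading of the formalism (omitting external transitions and coupled models): a generator model driven by an abstract simulator loop.

```python
class AtomicDEVS:
    """Minimal atomic-model interface (illustrative subset of DEVS)."""
    def time_advance(self): ...        # ta: time until next internal event
    def output(self): ...              # lambda: output emitted at that event
    def internal_transition(self): ... # delta_int: state change afterwards

class Generator(AtomicDEVS):
    """Emits 'job' every `period` time units until `count` jobs are sent."""
    def __init__(self, period, count):
        self.period, self.remaining = period, count
    def time_advance(self):
        return self.period if self.remaining > 0 else float("inf")
    def output(self):
        return "job"
    def internal_transition(self):
        self.remaining -= 1

def simulate(model, until):
    """Abstract simulator: jump to the next event, emit, then transition."""
    t, events = 0.0, []
    while (ta := model.time_advance()) != float("inf") and t + ta <= until:
        t += ta
        events.append((t, model.output()))
        model.internal_transition()
    return events

events = simulate(Generator(period=2.0, count=3), until=10.0)
print(events)  # [(2.0, 'job'), (4.0, 'job'), (6.0, 'job')]
```

The separation between the model (state plus the three functions) and the simulator loop is exactly what makes DEVS models composable and reusable across the different simulation forms the book discusses.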
This book includes the proceedings of the fourth workshop on recommender systems in fashion and retail (2022). It presents a state-of-the-art view of advances in recommendation systems applied to e-commerce, retail, and fashion, with chapters covering contributions from academic as well as industrial researchers active in this emerging field. Recommender systems are often used to solve complex problems in this setting, such as product recommendations, size and fit recommendations, and social-media-influenced recommendations (outfits worn by influencers).
Enterprise Information Systems Design, Implementation and Management: Organizational Applications investigates the creation and implementation of enterprise information systems. Covering a wide array of topics such as flow-shop scheduling, information systems outsourcing, ERP systems utilization, Dietz transaction methodology, and advanced planning systems, it is an essential reference source for researchers and professionals alike.
AI is an integral part of every video game, and this book helps game developers keep up with constantly evolving technological advances to create robust AI. The authors draw on their considerable experience and use case studies from real games to provide a complete reference. Also included are exercises so readers can test their comprehension of the concepts and practices presented. This revised and updated Third Edition includes new techniques, algorithms, data structures and representations needed to create powerful AI in games. It helps experienced game developers learn new techniques and provides students with a solid understanding of game AI that will help them jumpstart their careers.
This is a visual quick reference book that shows how to get the most out of Office 2010 applications, particularly if you haven't used the Office suite before.
This volume provides an overview of multimedia data mining and knowledge discovery and discusses the variety of hot topics in multimedia data mining research. It describes the objectives and current tendencies in multimedia data mining research and their applications. Each part contains an overview of its chapters and leads the reader with a structured approach through the diverse subjects in the field.
In an increasingly globalised world, despite reductions in costs and time, transportation has become even more important as a facilitator of economic and human interaction; this is reflected in technical advances in transportation systems, increasing interest in how transportation interacts with society and the need to provide novel approaches to understanding its impacts. This has become particularly acute with the impact that Covid-19 has had on transportation across the world, at local, national and international levels. Encyclopedia of Transportation, Seven Volume Set - containing almost 600 articles - brings a cross-cutting and integrated approach to all aspects of transportation from a variety of interdisciplinary fields including engineering, operations research, economics, geography and sociology in order to understand the changes taking place. Emphasising the interaction between these different aspects of research, it offers new solutions to modern-day problems related to transportation. Each of its nine sections is based around familiar themes, but brings together the views of experts from different disciplinary perspectives. Each section is edited by a subject expert who has commissioned articles from a range of authors representing different disciplines, different parts of the world and different social perspectives. The nine sections are structured around the following themes: Transport Modes; Freight Transport and Logistics; Transport Safety and Security; Transport Economics; Traffic Management; Transport Modelling and Data Management; Transport Policy and Planning; Transport Psychology; Sustainability and Health Issues in Transportation. Some articles provide a technical introduction to a topic whilst others provide a bridge between topics or a more future-oriented view of new research areas or challenges. The end result is a reference work that offers researchers and practitioners new approaches, new ways of thinking and novel solutions to problems. 
All-encompassing and expertly authored, this outstanding reference work will be essential reading for all students and researchers interested in transportation and its global impact in what is a very uncertain world.
The book is a collection of multidisciplinary contributions related to Geographic Hypermedia and highlights the technological aspects of GIS; specifically, it focuses on the underlying database and database management system. The methodologies for modeling and handling geographic data are described, and novel models, methods and tools applied in the Spatial Decision Support paradigm are presented.
This book provides insight into, and an enhanced appreciation of, the analysis, modeling and control of dynamic systems. The reader is assumed to be familiar with calculus and physics and to have some programming skills. The book develops the reader's ability to interpret the physical significance of mathematical results in system analysis and prepares the reader for more advanced treatment of the automatic control field. Learning objectives are performance-oriented, using interactive MATLAB and Simulink software tools, and realistic problems are presented for analyzing, designing and developing automatic control systems. Learning with computing tools can aid theory and help students think, analyze and reason in meaningful ways. The book is complemented with classroom slides and MATLAB and Simulink exercise files to help students focus on the fundamental concepts treated.
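The kind of simulation the book performs in MATLAB and Simulink can be sketched in a few lines of plain Python. Here, forward-Euler integration of a first-order system tau*dy/dt + y = K*u under a unit step input (hypothetical parameter values); the output settles near the gain K after roughly five time constants.

```python
# Step response of the first-order system  tau * dy/dt + y = K * u
# integrated with the forward Euler method.
tau, K = 2.0, 1.0        # time constant and steady-state gain (hypothetical)
dt, t_end = 0.001, 10.0  # step size and simulation horizon (5 time constants)
y, u = 0.0, 1.0          # zero initial condition, unit step input

ys, t = [], 0.0
while t < t_end:
    y += dt * (K * u - y) / tau   # Euler update: y' = (K*u - y) / tau
    t += dt
    ys.append(y)

# Analytically, y(t) = K * (1 - exp(-t / tau)); at t = 5*tau, y ~= 0.993 * K.
```

Replacing Euler with a higher-order scheme (or a Simulink solver) changes only the update line; the physical interpretation of tau and K, which the book emphasises, stays the same.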
This text compiles research from a vibrant social simulation community of researchers who have developed unique and innovative approaches to social simulation.
In 1982, Professor Pawlak published his seminal paper on what he called "rough sets" - a work which opened a new direction in the development of theories of incomplete information. Today, a decade and a half later, the theory of rough sets has evolved into a far-reaching methodology for dealing with a wide variety of issues centering on incompleteness and imprecision of information - issues which play a key role in the conception and design of intelligent information systems. "Incomplete Information: Rough Set Analysis" - or RSA for short - presents an up-to-date and highly authoritative account of the current status of the basic theory, its many extensions and wide-ranging applications. Edited by Professor Ewa Orlowska, one of the leading contributors to the theory of rough sets, RSA is a collection of nineteen well-integrated chapters authored by experts in rough set theory and related fields. A common thread that runs through these chapters ties the concept of incompleteness of information to those of indiscernibility and similarity.
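The link between incompleteness and indiscernibility can be made concrete with rough-set lower and upper approximations. In this small Python sketch (using an invented information table), objects sharing the same attribute values are indiscernible, and a target concept is approximated from inside and outside by unions of indiscernibility classes.

```python
from collections import defaultdict

# Hypothetical information table: objects described by (colour, size).
table = {
    "o1": ("red", "small"), "o2": ("red", "small"),
    "o3": ("blue", "big"),  "o4": ("blue", "big"),
    "o5": ("red", "big"),
}
target = {"o1", "o3", "o5"}   # the concept we want to approximate

# Indiscernibility classes: objects with identical attribute values
# cannot be told apart given the available information.
classes = defaultdict(set)
for obj, attrs in table.items():
    classes[attrs].add(obj)

# Lower approximation: classes entirely contained in the target
# (objects that certainly belong to the concept).
lower = {o for c in classes.values() if c <= target for o in c}

# Upper approximation: classes that intersect the target
# (objects that possibly belong to the concept).
upper = {o for c in classes.values() if c & target for o in c}

# Boundary region: objects about which the information is inconclusive.
boundary = upper - lower
```

Here only "o5" sits in a class wholly inside the target, so the concept is "rough": the boundary region is exactly the set of objects the available attributes cannot decide.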
"JDBC Metadata, MySQL, and Oracle Recipes" is the only book that focuses on metadata- or annotation-based code recipes for the JDBC API for use with Oracle and MySQL. It continues where the authors' other book, "JDBC Recipes: A Problem-Solution Approach," leaves off. This edition is also a Java EE 5-compliant book, perfect for lightweight Java database development, and it provides cut-and-paste code templates that can be immediately customized and applied in each developer's application development.
Video on Demand Systems brings together in one place important contributions and up-to-date research results in this fast-moving area, serving as an excellent reference and providing insight into some of the most challenging research issues in the field.
The hunt for new forms of value generation is shaping the future of economic and financial interactions, leading to the emergence of innovative business models and technological enablers. Beyond challenging our limits of time and space, such technological advancements have in some cases allowed value to be generated at nearly zero marginal cost. Inevitably, emergent tech solutions are fundamental game changers in digital and conventional finance. The book fleshes out the core developments and trending fintech 2.0 solutions that pose challenges and bring opportunities for businesses and economies. It comprises nine main chapters with collective insights and interdisciplinary perspectives covering the business, technology, and regulatory layers of financial technologies and decentralized finance, and it illustrates how to leverage these state-of-the-art technologies for the evolving digital and decentralized finance world. The book targets a broad audience of researchers, academics, industry professionals, fintech enthusiasts, and general business readers with timely data and up-to-date cases.
Introduction to Mathcad 15, 3/e is ideal for freshman or introductory courses in engineering and computer science. It introduces Mathcad's basic mathematical and data analysis functions (e.g., trigonometric, regression, and interpolation functions) using easy-to-follow examples, then applies the functions to examples drawn from emerging or rapidly developing fields in engineering. ESource - Prentice Hall's Engineering Source - provides a complete, flexible introductory engineering and computing program. ESource allows professors to fully customize their textbooks through the ESource website: professors can pick and choose not only modules but also sections of modules, incorporate their own materials, and re-paginate and re-index the complete project. prenhall.com/esource
Mastering modelling, and in particular numerical models, is becoming a crucial and central task in modern computational mechanics. Various tools have been derived that can quantify the quality of a model with regard to another one taken as the reference. Applied to computational strategies, these tools lead to new computational methods which are called "adaptive". The present book outlines the state of the art and the latest advances in both these important areas. Papers were selected from a workshop (Cachan, 17-19 September 1997), the third in a series devoted to Error Estimators and Adaptivity in Computational Mechanics. The Cachan workshop dealt with the latest advances in adaptive computational methods in mechanics and their impact on solving engineering problems. It also centered on providing answers to simple questions such as: what is being used, or can be used, at present to solve engineering problems? What should the state of the art be in the year 2000? What are the new questions involving error estimators and their applications?