Game Audio Fundamentals takes the reader on a journey through game audio design: from analog and digital audio basics, to the art and execution of sound effects, soundtracks, and voice production, as well as how to make sense of a truly effective soundscape. Presuming no pre-existing knowledge, this accessible guide is accompanied by online resources - including practical examples and incremental DAW exercises - and presents the theory and practice of game audio in detail, and in a format anyone can understand. This is essential reading for any aspiring game audio designer, as well as students and professionals from a range of backgrounds, including music, audio engineering, and game design.
Robust Technology with Analysis of Interference in Signal Processing discusses, for the first time, the theoretical fundamentals and algorithms for analysing noise as an information carrier. On this basis, a robust technology for processing noisy signals is developed. This technology can be applied to problems of control, identification, diagnostics, and pattern recognition in petrochemistry, power engineering, geophysics, medicine, physics, aviation, and other sciences and industries. The text explores the emerging possibility of forecasting failures in various objects, since failures are preceded by hidden microchanges that can be revealed through interference estimates. This monograph is of interest to students, postgraduates, engineers, research scientists and others concerned with processing measurement information on computers.
Recent developments in computer science clearly show the need for a better theoretical foundation for some central issues. Methods and results from mathematical logic, in particular proof theory and model theory, are of great help here and will be used much more in future than previously. This book provides an excellent introduction to the interplay of mathematical logic and computer science. It contains extensively reworked versions of the lectures given at the 1997 Marktoberdorf Summer School by leading researchers in the field.
Web Dynpro ABAP, a NetWeaver web application user interface tool from SAP, enables web programming connected to SAP systems. The authors' main focus was to create a book based on their own practical experience. Each chapter includes examples which lead through the content step-by-step and enable the reader to gradually explore and grasp the Web Dynpro ABAP process. The authors explain in particular how to design Web Dynpro components, the data binding and interface methods, and the view controller methods. They also describe the other SAP NetWeaver elements (ABAP Dictionary, Authorization) and the integration of the Web Dynpro application into the SAP NetWeaver Portal. The new edition has been expanded to include chapters on subjects such as POWER Lists; creating modal windows and external windows; using Web Dynpro application parameters and Shared Objects to communicate between the Web Dynpro ABAP application and Business Server Pages; and creating multi-language mails using Web Dynpro ABAP.
In August 1997 a conference titled "From Local to Global Optimization" was held at Storgarden in Rimforsa near the Linköping Institute of Technology, Sweden. The conference gave us the opportunity to celebrate Hoang Tuy's achievements in optimization during his 70 years of life. This book consists of a collection of research papers based on results presented during the conference and is dedicated to Professor Hoang Tuy on the occasion of his 70th birthday. The papers cover a wide range of recent results in Mathematical Programming. The work of Hoang Tuy, in particular in Global Optimization, has provided directions for new algorithmic developments in the field. We are indebted to Kluwer Academic Publishers for inviting us to publish this volume, and to the Center for Industrial Information Transfer (CENIIT) for financial support. We wish to thank the referees for their help and the authors for their papers. We also wish to join all contributors of this book in expressing birthday wishes and gratitude to Hoang Tuy for his inspiration, support, and friendship to all of us. Athanasios Migdalas, Panos M. Pardalos, and Peter Varbrand, November 1998. Hoang Tuy: An Appreciation. It is a pleasure for me as colleague and friend to take this opportunity to celebrate Hoang Tuy's numerous contributions to the field of mathematical programming.
This volume describes and analyzes in a systematic way the great contributions of the philosopher Krister Segerberg to the study of real and doxastic actions. Following an introduction which functions as a roadmap to Segerberg's works on actions, the first part of the book covers relations between actions, intentions and routines, dynamic logic as a theory of action, agency, and deontic logics built upon the logics of actions. The second section explores belief revision and update, iterated and irrevocable beliefs change, dynamic doxastic logic and hypertheories. Segerberg has worked for more than thirty years to analyze the intricacies of real and doxastic actions using formal tools - mostly modal (dynamic) logic and its semantics. He has had such a significant impact on modal logic that "It is hard to roam for long in modal logic without finding Krister Segerberg's traces," as Johan van Benthem notes in his chapter of this book.
These proceedings contain the papers of IFIP/SEC 2010. It was a special honour and privilege to chair the Program Committee and prepare the proceedings for this conference, which is the 25th in a series of well-established international conferences on security and privacy organized annually by Technical Committee 11 (TC-11) of IFIP. Moreover, in 2010 it is part of the IFIP World Computer Congress 2010, celebrating both the Golden Jubilee of IFIP (founded in 1960) and the Silver Jubilee of the SEC conference in the exciting city of Brisbane, Australia, during September 20-23. The call for papers went out with the challenging motto of "Security & Privacy - Silver Linings in the Cloud", building a bridge between the long-standing issues of security and privacy and the most recent developments in information and communication technology. It attracted 102 submissions. All of them were evaluated on the basis of their significance, novelty, and technical quality by at least five members of the Program Committee. The Program Committee meeting was held electronically over a period of a week. Of the papers submitted, 25 were selected for presentation at the conference; the acceptance rate (25 of 102) was therefore as low as 24.5%, making SEC 2010 a highly competitive forum. One of those 25 submissions could unfortunately not be included in the proceedings, as none of its authors registered in time to present the paper at the conference.
Learn how to use C++ to transform program logic and design concepts into working programs with Smith's C++ PROGRAMS TO ACCOMPANY PROGRAMMING LOGIC AND DESIGN, 8E. Specifically designed to be paired with the latest edition of Farrell's highly successful PROGRAMMING LOGIC AND DESIGN, this new guide combines the power of C++ with the popular, language-independent, logical approach of the PROGRAMMING LOGIC AND DESIGN text. Together, the two books provide the perfect opportunity for readers to learn the fundamentals of programming, while also learning an actual leading programming language.
Advanced approaches to software engineering and design are capable of solving complex computational problems and achieving standards of performance that were unheard of only decades ago. Handbook of Research on Emerging Advancements and Technologies in Software Engineering presents a comprehensive investigation of the most recent discoveries in software engineering research and practice, with studies in software design, development, implementation, testing, analysis, and evolution. Software designers, architects, and technologists, as well as students and educators, will find this book to be a vital and in-depth examination of the latest notable developments within the software engineering community.
With the purpose of building upon standard web technologies, open linked data serves as a useful way to connect previously unrelated data and to publish structured data on the web. The application of these elements leads to the creation of data commons called semantic web. Cases on Open-Linked Data and Semantic Web Applications brings together new theories, research findings and case studies which cover the recent developments and approaches towards applied open linked data and semantic web in the context of information systems. By enhancing the understanding of open linked data in business, science and information technologies, this reference source aims to be useful for academics, researchers, and practitioners.
Asynchronous On-Chip Networks and Fault-Tolerant Techniques is the first comprehensive study of fault tolerance and fault-caused deadlock effects in asynchronous on-chip networks, aiming to overcome these drawbacks and ensure greater reliability of applications. A promising alternative to the widely used synchronous on-chip networks for multicore processors, asynchronous on-chip networks can deliver the same performance with much lower energy and area than their synchronous counterparts, yet they remain vulnerable to faults: faults can not only corrupt data transmission but also cause a unique type of deadlock. By adopting a new redundant code along with a dynamic fault detection and recovery scheme, the authors demonstrate that asynchronous on-chip networks can be efficiently hardened to tolerate both transient and permanent faults and to overcome fault-caused deadlocks. This book will serve as an essential guide for researchers and students studying interconnection networks, fault-tolerant computing, asynchronous system design, circuit design and on-chip networking, as well as for professionals interested in designing fault-tolerant and high-throughput asynchronous circuits.
'Visual Languages for Interactive Computing' presents problems and methodologies related to the syntax, semantics, and ambiguities of visual languages.
Today, computers fulfil a dazzling array of roles, a flexibility resulting from the great range of programs that can be run on them. "A Science of Operations" examines the history of what we now call programming, defined not simply as "computer" programming, but more broadly as the definition of the steps involved in computations and other information-processing activities. This unique perspective highlights how the history of programming is distinct from the history of the computer, despite the close relationship between the two in the 20th century. The book also discusses how the development of programming languages is related to disparate fields which attempted to give a mechanical account of language on the one hand, and a linguistic account of machines on the other. Topics and features: covers the early development of automatic computing, including Babbage's "mechanical calculating engines" and the applications of punched-card technology; examines the theoretical work of mathematical logicians such as Kleene, Church, Post and Turing, and the machines built by Zuse and Aiken in the 1930s and 1940s; discusses the role that logic played in the development of the stored program computer; describes the "standard model" of machine-code programming popularised by Maurice Wilkes; presents the complete table for the universal Turing machine in the Appendices; investigates the rise of the initiatives aimed at developing higher-level programming notations, and how these came to be thought of as 'languages' that could be studied independently of a machine; examines the importance of the Algol 60 language, and the framework it provided for studying the design of programming languages and the process of software development; and explores the early development of object-oriented languages, with a focus on the Smalltalk project. This fascinating text offers a new viewpoint for historians of science and technology, as well as for the general reader. The historical narrative builds the story in a clear and logical fashion, roughly following chronological order.
.NET represents a new and improved way of developing software for the Windows platform. Given the chance, you'd probably rewrite all of your existing code in the newer managed code environment that .NET provides. But it is difficult or impossible to throw out all existing legacy code and start over when a new technology arrives. Instead, you need to find a way to move forward with new .NET development while reusing existing pieces of tested, working code. You need a way to interoperate with the existing code until you have a chance to finally rewrite all of it in .NET. The only recipe-style book on the subject, ".NET 2.0 Interoperability Recipes: A Problem-Solution Approach" guides Windows developers who are transitioning from native Windows code to .NET managed code. The book explains new interop features in .NET 2.0 and VS .NET 2005; covers PInvoke, COM, and COM+ (other books don't cover all three areas); features most of its example code in C# and VB .NET, and also includes some managed C++/CLI; and is written by a working developer with first-hand experience. .NET tools will allow you to interoperate with existing code. But finding the appropriate tool for the task at hand can sometimes be a frustrating experience. So this book will guide you past myriad infrequently used interop options to focus on those you'll use most often.
In this book, the author considers separable programming and, in particular, one of its important cases - convex separable programming. Some general results are presented, and techniques for approximating the separable problem by linear programming and by dynamic programming are considered. Convex separable programs subject to inequality/equality constraint(s) and bounds on variables are also studied, and iterative algorithms of polynomial complexity are proposed. As an application, these algorithms are used in the implementation of stochastic quasigradient methods for some separable stochastic programs. Numerical approximation with respect to the l1 and l∞ norms, as a convex separable nonsmooth unconstrained minimization problem, is considered as well. Audience: advanced undergraduate and graduate students, mathematical programming/operations research specialists.
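For orientation, the problem class referred to above can be sketched as follows (a generic textbook statement, not necessarily the book's exact formulation): a separable program optimizes a sum of one-variable functions subject to constraints of the same additive form,

    minimize    f(x) = f_1(x_1) + f_2(x_2) + ... + f_n(x_n)
    subject to  g_i1(x_1) + g_i2(x_2) + ... + g_in(x_n) <= b_i,   i = 1, ..., m,
                a_j <= x_j <= c_j,                                j = 1, ..., n,

and it is convex separable when every f_j and g_ij is convex. Because each function involves only a single variable, every one-dimensional piece can be approximated by a piecewise-linear function, which is what makes the linear programming and dynamic programming approximations mentioned in the blurb tractable.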
Programming in Scala is the definitive book on Scala, the popular language for the Java platform that blends object-oriented and functional programming concepts into a unique and powerful tool for developers. The fifth edition has been updated to cover new features up to, and including, Scala version 3.0. The Scala language has been exploding in popularity in recent years. More than 54,000 copies of Programming in Scala have been sold since the first edition was published in 2008.
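To illustrate the object-oriented/functional blend the blurb refers to, here is a minimal, hypothetical Scala 3 snippet (not taken from the book):

    // Object-oriented side: an immutable data type with its own method.
    case class Point(x: Double, y: Double):
      def shifted(dx: Double, dy: Double): Point = Point(x + dx, y + dy)

    // Functional side: immutable collections and higher-order functions.
    @main def demo(): Unit =
      val points = List(Point(0.0, 0.0), Point(1.0, 2.0), Point(3.0, 4.0))
      val xs = points.map(_.shifted(1.0, 1.0)).filter(_.x > 1.0).map(_.x)
      println(xs) // prints List(2.0, 4.0)

The same Point class is used both as a conventional object with behaviour and as plain data flowing through map and filter, which is the kind of mixing the blurb alludes to.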
What will business software look like in the future? And how will it be developed? This book covers the proceedings of the first international conference on Future Business Software - a new think tank discussing the trends in enterprise software with speakers from Europe's most successful software companies and the leading research institutions. The articles focus on two of the most prominent trends in the field: emergent software and agile development processes. "Emergent Software" is a new paradigm of software development that addresses the highly complex requirements of tomorrow's business software and aims at dynamically and flexibly combining a business software solution's different components in order to fulfill customers' needs with a minimum of effort. Agile development processes are the response of software technology to the implementation of diverse and rapidly changing software requirements. A major focus is on the minimization of project risks, e.g. through short, iterative development cycles, test-driven development and an intensive culture of communication.
R/3 is a business system that has gained global prominence. However, SAP R/3 has 237,000 function modules, and quite often programmers are unaware that a module exists which can be of help in their programs. This convenient resource is a collection of the most common ABAP modules, demonstrated within simple programs. These programs, providing easily searchable examples, can be accessed from http://extras.springer.com/978-1-85233-775-9. The modules in this book are organised for quick reference. This concise reference contains: a full explanation of the layout of reference entries; a brief introduction to SAP; coverage of conversion and date and time modules; file and directory modules; list, long texts, and number modules; useful integration modules for MS Office and pop-up dialog box management. This book organises over 300 modules, many of which are otherwise undocumented, arranges them for quick and easy reference, and explains when and where to use the most common SAP R/3 ABAP function modules.
Process Technology brings together in one place important contributions and up-to-date research results in this fast-moving area. Process Technology serves as an excellent reference, providing insight into some of the most challenging research issues in the field.
This indispensable text introduces the foundations of three-dimensional computer vision and describes recent contributions to the field. Fully revised and updated, this much-anticipated new edition reviews a range of triangulation-based methods, including linear and bundle adjustment based approaches to scene reconstruction and camera calibration, stereo vision, point cloud segmentation, and pose estimation of rigid, articulated, and flexible objects. Also covered are intensity-based techniques that evaluate the pixel grey values in the image to infer three-dimensional scene structure, and point spread function based approaches that exploit the effect of the optical system. The text shows how methods which integrate these concepts are able to increase reconstruction accuracy and robustness, describing applications in industrial quality inspection and metrology, human-robot interaction, and remote sensing.
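As a flavour of the triangulation-based relations underlying the methods listed above (a standard textbook result, not specific to this book): for a rectified stereo pair with focal length f (in pixels), baseline B between the two cameras, and measured disparity d of a scene point, the reconstructed depth is

    Z = f * B / d,

so a fixed error in the measured disparity translates into a depth error that grows with distance, which is one reason reconstruction accuracy and robustness are recurring themes in stereo vision.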
This 2nd edition textbook has been expanded with 175 additional pages of content, created in response to readers' feedback as well as to new hardware and software releases. The book presents foundational robotics concepts using the ROBOTIS BIOLOID and OpenCM9.04 robotic systems, and is suitable as a curriculum for a first course in robotics for undergraduate students or for self-learners. It covers wheel-based robots as well as walking robots. Although it uses the standard "Sense, Think, Act" approach, communications (bot-to-bot and PC-to-bot) programming concepts are treated in more depth (wired, and wireless via ZigBee/Bluetooth). Algorithms are developed and described via ROBOTIS' proprietary RoboPlus IDE, as well as the more open Arduino-based Embedded C environments. Additionally, a vast array of web-based multimedia materials is used to illustrate robotics concepts and code implementations, and to show videos of actual resulting robot behaviors. Advanced sensor interfacing for gyroscopes, inertial measurement units, foot pressure sensors and color cameras is also demonstrated.
This book deals with the theory and applications of the Reformulation-Linearization/Convexification Technique (RLT) for solving nonconvex optimization problems. A unified treatment of discrete and continuous nonconvex programming problems is presented using this approach. In essence, the bridge between these two types of nonconvexities is made via a polynomial representation of discrete constraints. For example, the binariness of a 0-1 variable x_j can be equivalently expressed as the polynomial constraint x_j(1 - x_j) = 0. The motivation for this book is the role of tight linear/convex programming representations or relaxations in solving such discrete and continuous nonconvex programming problems. The principal thrust is to commence with a model that affords a useful representation and structure, and then to further strengthen this representation through automatic reformulation and constraint generation techniques. As mentioned above, the focal point of this book is the development and application of RLT for use as an automatic reformulation procedure, and also to generate strong valid inequalities. The RLT operates in two phases. In the Reformulation Phase, certain types of additional implied polynomial constraints, which include the aforementioned constraints in the case of binary variables, are appended to the problem. The resulting problem is subsequently linearized in the Linearization/Convexification Phase, except that certain convex constraints are sometimes retained in particular special cases. This is done via the definition of suitable new variables to replace each distinct variable-product term. The higher-dimensional representation yields a linear (or convex) programming relaxation.
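For a concrete flavour of the two phases (an illustrative textbook instance, not the book's general construction): take two 0-1 variables x_i and x_j. In the Reformulation Phase, the pairwise products of the bound factors x_i >= 0, 1 - x_i >= 0, x_j >= 0, 1 - x_j >= 0 are appended to the problem; in the Linearization Phase, the product x_i x_j is replaced by a new variable w_ij, turning those product constraints into the linear constraints

    w_ij >= 0,    w_ij <= x_i,    w_ij <= x_j,    w_ij >= x_i + x_j - 1.

At binary values of x_i and x_j these constraints force w_ij = x_i x_j exactly, while in between they define a linear relaxation of the product term.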
Industrial development of software systems needs to be guided by recognized engineering principles. Commercial-off-the-shelf (COTS) components enable the systematic and cost-effective reuse of prefabricated tested parts, a characteristic approach of mature engineering disciplines. This reuse necessitates a thorough test of these components to make sure that each works as specified in a real context. Beydeda and Gruhn invited leading researchers in the area of component testing to contribute to this monograph, which covers all related aspects from testing components in a context-independent manner through testing components in the context of a specific system to testing complete systems built from different components. The authors take the viewpoints of both component developers and component users, and their contributions encompass functional requirements such as correctness and functionality compliance as well as non-functional requirements like performance and robustness. Overall this monograph offers researchers, graduate students and advanced professionals a unique and comprehensive overview of the state of the art in testing COTS components and COTS-based systems.