The physics of metal forming and metal removal is normally expressed using non-linear partial differential equations, which can be solved using the finite element method (FEM). However, when the process parameters are uncertain and/or the physics of the process is not well understood, soft computing techniques can be used with FEM or alone to model the process. Using FEM, fuzzy set theory and neural networks as modeling tools, Modeling of Metal Forming and Machining Processes provides a complete treatment of metal forming and machining, and includes: an explanation of FEM and its application to the modeling of manufacturing processes; a discussion of the numerical difficulties of FEM; and chapters on the application of soft computing techniques in this modeling process. The algorithms and solved examples included make Modeling of Metal Forming and Machining Processes of value to postgraduates, senior undergraduates, lecturers and researchers in these fields. R&D engineers and consultants for the manufacturing industry will also find it of use.
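As a concrete illustration of the FEM idea mentioned above, the sketch below solves a one-dimensional linear model problem (-u'' = f on [0,1] with zero boundary values) using piecewise-linear elements. This is far simpler than the nonlinear metal-forming equations the book treats, and the function and parameter names are illustrative, not taken from the book.

```python
import numpy as np

def fem_poisson_1d(n_elems=8, f=1.0):
    """Linear finite elements for -u'' = f on [0, 1], u(0) = u(1) = 0."""
    h = 1.0 / n_elems
    n = n_elems - 1                      # number of interior nodes
    # Assembled stiffness matrix: tridiagonal (-1, 2, -1) scaled by 1/h.
    K = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h
    b = f * h * np.ones(n)               # consistent load for constant f
    u = np.linalg.solve(K, b)            # nodal values of the FEM solution
    x = np.linspace(h, 1.0 - h, n)       # interior node coordinates
    return x, u

x, u = fem_poisson_1d()
exact = x * (1.0 - x) / 2.0              # exact solution of the model problem
print(np.max(np.abs(u - exact)))
```

For this particular problem the nodal values of the linear-element solution coincide with the exact solution, so the printed error is at the level of floating-point round-off; nonlinear process models require iterating such linear solves.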
In the early days of the Web a need was recognized for a language
to display 3D objects through a browser. An HTML-like language,
VRML, was proposed in 1994 and became the standard for describing
interactive 3D objects and worlds on the Web. 3D Web courses were
started, several best-selling books were published, and VRML
continues to be used today. However, because it was modeled on
HTML, VRML is a stodgy language that is not easy to integrate with
other applications and has been difficult to extend.
Meanwhile, applications for interactive 3D graphics have been
exploding in areas such as medicine, science, industry, and
entertainment. There is a strong need for a set of modern Web-based
technologies, applied within a standard extensible framework, to
enable a new generation of modeling & simulation applications
to emerge, develop, and interoperate. X3D is the next-generation
open standard for 3D on the Web. It is the result of several years
of development by the Web 3D Consortium's X3D Task Group. Instead
of a large monolithic specification (like VRML), which requires
full adoption for compliance, X3D is a component-based architecture
that can support applications ranging from a simple non-interactive
animation to the latest streaming or rendering applications. X3D
replaces VRML, but also provides compatibility with existing VRML
content and browsers. Don Brutzman organized the first symposium on
VRML and is playing a similar role with X3D; he is a founding
member of the consortium. Len Daly is a professional member of the
consortium and both Len and Don have been involved with the
development of the standard from the start.
This book presents an overview of the field of multimodal location estimation. The authors' aim is to describe the research results in this field in a unified way. The book describes fundamental methods of acoustic, visual, textual, social graph, and metadata processing as well as multimodal integration methods used for location estimation. In addition, the book covers benchmark metrics and explores the limits of the technology based on a human baseline. The book also outlines privacy implications and discusses directions for future research in the area.
The adoption of ICT for personal and business use has encouraged the growth of interactive learning as well as its application in a number of education and training scenarios. Designing effective syllabi for interactive learning projects helps to ensure that desired learning outcomes are achieved without incurring a significant loss of time or money. Educational Stages and Interactive Learning: From Kindergarten to Workplace Training provides a record of current research and practical applications in interactive learning. This book reviews all aspects of interactive learning, investigates the history, status, and future trends of interactive learning, introduces emerging technologies for interactive learning, and analyzes interactive learning cases in various educational stages and learning situations. Readers interested in the technologies and pedagogical applications of interactive learning will find this book a comprehensive reference for the understanding of notions, theories, techniques, and methods related to the research and development of interactive learning.
Make-believe plays a far stronger role in both the design and use of interfaces, games and services than we have come to believe. This edited volume illustrates ways of grasping and utilising that connection to improve interaction, user experiences, and customer value. Useful for designers, undergraduates and researchers alike, this new research provides tools for understanding and applying make-believe in various contexts, ranging from digital tools to physical services. It takes the reader through a world of imagination and intuition applied in efficient practice, with topics including the connection of human-computer interaction (HCI) to make-believe and backstories, the presence of imagination in gamification, gameworlds, virtual worlds and service design, and the believability of make-believe based designs in various contexts. Furthermore, it discusses the challenges inherent in applying make-believe as a basis for interaction design, as well as the enactive mechanism behind it. Whether used as a university textbook or simply as design inspiration, Digital Make-Believe provides new and efficient insight into approaching interaction in the way in which actual users of devices, software and services can innately utilise it.
Nowadays, engineering systems are of ever-increasing complexity and must be considered as multidisciplinary systems composed of interacting subsystems or system components from different engineering disciplines. Thus, an integration of various engineering disciplines, e.g., mechanical, electrical and control engineering, in a concurrent design approach is required. With regard to the systematic development and analysis of system models, interdisciplinary computer aided methodologies are becoming more and more important. A graphical description formalism particularly suited for multidisciplinary systems is the bond graph, devised by Professor Henry Paynter as early as 1959 at the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts, USA, and in use since then all over the world. This monograph is devoted exclusively to the bond graph methodology. It gives a comprehensive, in-depth, state-of-the-art presentation including recent results scattered over research articles and dissertations, and research contributions by the author to a number of topics. The book systematically covers the fundamentals of developing bond graphs and deriving mathematical models from them, the recent developments in methodology, and the symbolic and numerical processing of mathematical models derived from bond graphs. Additionally, it discusses modern modelling languages, the paradigm of object-oriented modelling, and modern software that can be used for building and processing bond graph models, and provides a chapter with small case studies illustrating various applications of the methodology.
In fields as diverse as research and development, governance, and international trade, success depends on effective communication. However, limited research exists on how professionals can express themselves consistently across disciplines. Modern Trends Surrounding Information Technology Standards and Standardization within Organizations showcases the far-ranging economic and societal ramifications incited by technical standardization between individuals, organizations, disciplines, and nations. This publication serves as a valuable model for inter-disciplinary scholars, IT researchers, and professionals interested in the link between technology and social change in an increasingly networked and interconnected global society.
This book presents four mathematical essays which explore the foundations of mathematics and related topics, ranging from philosophy and logic to modern computer mathematics. While connected to the historical evolution of these concepts, the essays place strong emphasis on developments still to come. The book originated in a 2002 symposium celebrating the work of Bruno Buchberger, Professor of Computer Mathematics at Johannes Kepler University, Linz, Austria, on the occasion of his 60th birthday. Among many other accomplishments, Professor Buchberger in 1985 was the founding editor of the Journal of Symbolic Computation; the founder of the Research Institute for Symbolic Computation (RISC) and its chairman from 1987 to 2000; and the founder in 1990 of the Softwarepark Hagenberg, Austria, and since then its director. More than a decade in the making, Mathematics, Computer Science and Logic - A Never Ending Story includes essays by leading authorities on such topics as mathematical foundations from the perspective of computer verification; a symbolic-computational philosophy and methodology for mathematics; the role of logic and algebra in software engineering; and new directions in the foundations of mathematics. These inspiring essays invite general, mathematically interested readers to share state-of-the-art ideas which advance the never ending story of mathematics, computer science and logic. Mathematics, Computer Science and Logic - A Never Ending Story is edited by Professor Peter Paule, Bruno Buchberger's successor as director of the Research Institute for Symbolic Computation.
This book provides insight into the challenges in providing data authentication over wireless communication channels. The authors posit that established standard authentication mechanisms - for wired devices - are not sufficient to authenticate data, such as voice, images, and video over wireless channels. The authors propose new mechanisms based on the so-called soft authentication algorithms, which tolerate some legitimate modifications in the data that they protect. The authors explain that the goal of these algorithms is that they are tolerant to changes in the content but are still able to identify the forgeries. The authors go on to describe how an additional advantage of the soft authentication algorithms is the ability to identify the locations of the modifications and correct them if possible. The authors show how to achieve this by protecting the data features with the help of error correcting codes. The correction methods are typically based on watermarking, as the authors discuss in the book. Provides a discussion of data (particularly image) authentication methods in the presence of noise experienced in wireless communication; Presents a new class of soft authentication methods, instead of the standard hard authentication methods, used to tolerate minor changes in image data; Features authentication methods based on the usage of authentication tags as well as digital watermarks.
The computer gaming industry is bigger than the film and music industries and is growing faster than both of them put together. The industry is also changing fast. The typical computer gamer is in his mid-20s, and female gamers make up one of the fastest-growing parts of the market. New developments in sociability and interactivity are also transforming the industry. This is the first major study of brands and gaming, and it shows huge opportunities for brand development.
This book offers a basic introduction to genetic algorithms. It provides a detailed explanation of genetic algorithm concepts and examines numerous genetic algorithm optimization problems. In addition, the book presents implementation of optimization problems using C and C++ as well as simulated solutions for genetic algorithm problems using MATLAB 7.0. It also includes application case studies on genetic algorithms in emerging fields.
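To make the concepts concrete, here is a minimal generational genetic algorithm sketched in Python (the book's own implementations use C, C++ and MATLAB); the function and parameter names are illustrative, and the OneMax objective (count of 1-bits) is a standard toy problem, not one of the book's case studies.

```python
import random

def genetic_algorithm(fitness, n_bits=16, pop_size=40, generations=60,
                      crossover_rate=0.9, mutation_rate=0.02, seed=1):
    """Maximize `fitness` over bit strings with a simple generational GA."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        # Tournament selection: keep the fitter of two random individuals.
        def select():
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < crossover_rate:       # single-point crossover
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for c in (c1, c2):                      # bit-flip mutation
                for i in range(n_bits):
                    if rng.random() < mutation_rate:
                        c[i] = 1 - c[i]
                children.append(c)
        pop = children[:pop_size]
        best = max(pop + [best], key=fitness)       # track the best seen so far
    return best

# OneMax: fitness is simply the number of 1-bits; the optimum is all ones.
solution = genetic_algorithm(fitness=sum)
print(sum(solution))
```

Real optimization problems replace the bit-counting objective with a decoded cost function; selection, crossover and mutation stay the same.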
Cognitive Intelligence with Neutrosophic Statistics in Bioinformatics investigates and presents the many applications that have arisen in the last ten years using neutrosophic statistics in bioinformatics, medicine, agriculture and cognitive science. This book will be very useful to the scientific community, appealing to audiences interested in fuzzy, vague concepts from which uncertain data are collected, including academic researchers, practicing engineers and graduate students. Neutrosophic statistics is a generalization of classical statistics. In classical statistics, the data are known and formed by crisp numbers. In comparison, data in neutrosophic statistics have some indeterminacy: they may be ambiguous, vague, imprecise, incomplete, or even unknown. Neutrosophic statistics refers to a set of data such that the data, or a part of it, are indeterminate to some degree, and to methods used to analyze such data.
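As a minimal sketch of the idea, assuming indeterminate observations are recorded as intervals (one common convention in neutrosophic statistics; the helper names and the indeterminacy measure used here are illustrative assumptions, not from the book), an interval-valued sample mean can be computed like this:

```python
# Neutrosophic observations are modeled as intervals [lower, upper];
# a crisp (classical) value has lower == upper.
def neutrosophic_mean(samples):
    """samples: list of (lower, upper) tuples; returns the interval mean."""
    n = len(samples)
    lo = sum(a for a, _ in samples) / n
    hi = sum(b for _, b in samples) / n
    return lo, hi

def indeterminacy(interval):
    """Relative width of an interval: 0.0 for fully crisp data."""
    lo, hi = interval
    return 0.0 if hi == 0 else (hi - lo) / hi

# Three crisp readings and one uncertain one, known only to lie in [4, 6].
data = [(2.0, 2.0), (3.0, 3.0), (5.0, 5.0), (4.0, 6.0)]
mean = neutrosophic_mean(data)
print(mean, indeterminacy(mean))
```

The mean is itself an interval, (3.5, 4.0) here, so the uncertainty in the raw data propagates visibly into the summary statistic instead of being averaged away.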
Tuning of SQL code is generally cheaper than changing the data
model. Physical and configuration tuning involves a search for
bottlenecks that often points to SQL code or data model issues.
Building an appropriate data model and writing properly performing
SQL code can give 100%+ performance improvement. Physical and
configuration tuning often gives at most a 25% performance
increase.
Since its inception in 1989, the Web has changed into an environment where Web applications range from small-scale information dissemination applications, often developed by non-IT professionals, to large-scale, commercial, enterprise-planning and scheduling applications, developed by multidisciplinary teams of people with diverse skills and backgrounds and using cutting-edge, diverse technologies. As an engineering discipline, Web engineering must provide principles, methodologies and frameworks to help Web professionals and researchers develop applications and manage projects effectively. Mendes and Mosley have selected experts from numerous areas in Web engineering, who contribute chapters in which important concepts are presented and then detailed using real industrial case studies. After an introduction into the discipline itself and its intricacies, the contributions range from Web effort estimation, productivity benchmarking and conceptual and model-based application development methodologies, to other important principles such as usability, reliability, testing, process improvement and quality measurement. This is the first book that looks at Web engineering from a measurement perspective. The result is a self-contained, comprehensive overview detailing the role of measurement and metrics within the context of Web engineering. This book is ideal for professionals and researchers who want to know how to use sound principles for the effective management of Web projects, as well as for courses at an advanced undergraduate or graduate level.
The traditional division of labor between the database (which only
stores and manages SQL and XML data for fast, easy data search and
retrieval) and the application server (which runs application or
business logic, and presentation logic) is obsolete. Although the
book's primary focus is on programming the Oracle Database, the
concepts and techniques provided apply to most RDBMS that support
Java including Oracle, DB2, Sybase, MySQL, and PostgreSQL. This is
the first book to cover new Java, JDBC, SQLJ, JPublisher and Web
Services features in Oracle Database 10g Release 2 (the coverage
starts with Oracle 9i Release 2). This book is a must-read for
database developers (DBAs, database application
developers, data architects), Java developers (JDBC, SQLJ, J2EE,
and OR mapping frameworks), and emerging Web Services
assemblers.
Video games are a relatively late arrival on the cultural stage. While the academic discipline of game studies has evolved quickly since the 1990s, academia is only beginning to grasp the intellectual, philosophical, aesthetic, and existential potency of the new medium. The same applies to the question of whether video games are (or are not) art in and of themselves. Based on Communication-Oriented Analysis, the authors assess the plausibility of games-as-art and define the domains associated with this question.
Grid architecture is Oracle's strategy for high-end computing, and
RAC is the stepping stone into this arena. This book focuses on
current technology including all valid RAC features up through
Oracle Database 10g Release 2, with a primary focus on deploying it
in a high-end grid environment. The book discusses this technology
at length which users will find beneficial when researching,
implementing or monitoring a RAC environment. The author covers
workshop implementation of services and the distribution of
workload across instances, with threshold definitions and the new
load balancing algorithms. In addition, it includes detailed
discussions of ASM that complement the implementation of RAC in
Oracle's Grid strategy. The book also discusses the new
Oracle Clusterware, its components and its integration with RAC.
I3E 2009 was held in Nancy, France, during September 23-25, hosted by Nancy University and INRIA Grand-Est at LORIA. The conference provided scientists and practitioners of academia, industry and government with a forum where they presented their latest findings concerning the application of e-business, e-services and e-society, and the underlying technology to support these applications. The 9th IFIP Conference on e-Business, e-Services and e-Society, sponsored by IFIP WG 6.1 of Technical Committee TC6 in cooperation with TC11 and TC8, represents the continuation of previous events held in Zurich (Switzerland) in 2001, Lisbon (Portugal) in 2002, Sao Paulo (Brazil) in 2003, Toulouse (France) in 2004, Poznan (Poland) in 2005, Turku (Finland) in 2006, Wuhan (China) in 2007 and Tokyo (Japan) in 2008. The call for papers attracted papers from 31 countries from the five continents. As a result, the I3E 2009 program offered 12 sessions of full-paper presentations. The 31 selected papers cover a wide and important variety of issues in e-business, e-services and e-society, including security, trust, and privacy, ethical and societal issues, business organization, provision of services as software and software as services, and others. Extended versions of selected papers submitted to I3E 2009 will be published in the International Journal of e-Adoption and in AIS Transactions on Enterprise Systems. In addition, a 500-euro prize was awarded to the authors of the best paper selected by the Program Committee. We thank all authors who submitted their papers, and the Program Committee members and external reviewers for their excellent work.
Get a firm grip on one of the most popular project management applications on the market today. In Microsoft Project Fundamentals: Microsoft Project Standard 2021, Professional 2021, and Project Online Editions, accomplished project management leader Teresa Stover delivers a hands-on introduction to Microsoft's popular project management software, filled with real-world examples and plain-language guidance. The book walks you through how to plan, schedule, manage resources, track progress, and more. In the book, you'll: learn principles and best practices of project management while mastering Microsoft Project capabilities, calculations, and views; understand how task durations, dependencies, and date constraints power the project schedule; manage human, equipment, and material resources, including availability, cost, and task assignments; adjust the project to optimize for the project finish date, budget, and resource allocation; and use Microsoft Project to manage waterfall or agile projects. Ideal for anyone seeking to unlock the potential of Microsoft's leading project management software for their own project work, Microsoft Project Fundamentals is an essential resource for those new to Microsoft Project and project management, as well as previous users and seasoned project professionals looking for a refresher in the latest features of the newest version of Microsoft Project.
This book provides a quick access to computational tools for algebraic geometry, the mathematical discipline which handles solution sets of polynomial equations. Originating from a number of intense one week schools taught by the authors, the text is designed so as to provide a step by step introduction which enables the reader to get started with his own computational experiments right away. The authors present the basic concepts and ideas in a compact way.
This book includes a short history of interactive narrative and an account of a small-group, collaboratively authored social media narrative: Romeo and Juliet on Facebook: After Love Comes Destruction. At the forefront of narrative innovation are social media channels - speculative spaces for creating and experiencing stories that are interactive and collaborative. Media, however, is only the access point to the expressiveness of narrative content. Wikis, messaging, mash-ups, and social media (Facebook, Twitter, YouTube and others) are on a trajectory of participatory story creation that goes back many centuries. These forms offer authors ways to create narrative meaning that reflects our current media culture, as the harlequinade reflected the culture of the 18th century, and as the volvelle reflected that of the 13th century. Interactivity, Collaboration, and Authoring in Social Media first prospects the last millennium for antecedents of today's authoring practices. It does so with a view to considering how today's digital manifestations are a continuation, perhaps a reiteration, perhaps a novel pioneering, of humans' abiding interest in interactive narrative. The book then takes the reader inside the process of creating a collaborative, interactive narrative in today's social media through an authoring experience undertaken by a group of graduate students. The engaging mix of blogs, emails, personal diaries, and fabricated documents used to create the narrative demonstrates that a social media environment can facilitate a meaningful and productive collaborative authorial experience and result in an abundance of networked, personally expressive, and visually and textually referential content.
The resulting narrative, After Love Comes Destruction, based on Shakespeare's Romeo and Juliet, shows how a generative narrative space evolved around the students' use of social media in ways they had not previously considered, both for authoring and for delivery of their final narrative artifact.
This volume features selected contributions on a variety of topics related to linear statistical inference. The peer-reviewed papers from the International Conference on Trends and Perspectives in Linear Statistical Inference (LinStat 2016) held in Istanbul, Turkey, 22-25 August 2016, cover topics in both theoretical and applied statistics, such as linear models, high-dimensional statistics, computational statistics, the design of experiments, and multivariate analysis. The book is intended for statisticians, Ph.D. students, and professionals who are interested in statistical inference.
This book offers a comprehensive explanation of iterated function systems and how to use them in the generation of complex objects. Discussion covers the most popular fractal models applied in the field of image synthesis; surveys iterated function system models; explores algorithms for creating and manipulating fractal objects, and techniques for implementing the algorithms; and more. The book includes both descriptive text and pseudo-code samples for the convenience of graphics application programmers.
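In the spirit of the book's pseudo-code samples, here is a short chaos-game sketch in Python for the classic Sierpinski-triangle iterated function system; the vertex coordinates and helper names are illustrative choices for this sketch, not taken from the book.

```python
import random

# The Sierpinski-triangle IFS consists of three affine maps, each
# contracting the plane by 1/2 toward one vertex of a triangle.
VERTICES = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]

def chaos_game(n_points=5000, seed=0):
    """Approximate the IFS attractor by iterating randomly chosen maps."""
    rng = random.Random(seed)
    x, y = 0.5, 0.5
    points = []
    for i in range(n_points + 100):      # discard a short initial transient
        vx, vy = rng.choice(VERTICES)    # pick one of the three maps
        x, y = (x + vx) / 2.0, (y + vy) / 2.0
        if i >= 100:
            points.append((x, y))
    return points

pts = chaos_game()
# Every generated point stays inside the triangle's bounding box.
print(all(0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 for x, y in pts))
```

Plotting the returned points reveals the fractal; swapping in a different set of contractive affine maps (with appropriate selection probabilities) yields other attractors such as the Barnsley fern.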