In recent years, it has become apparent that knowing the average atomic structure of materials is insufficient to understand their properties. Diffuse scattering, in addition to Bragg scattering, holds the key to learning about defects in materials, the topic of many recent books. What has been missing is a detailed step-by-step guide to simulating disordered materials. The DISCUS cookbook fills this need, covering simple topics such as building a computer crystal as well as complex topics such as domain structures, stacking faults, and the use of advanced refinement techniques to adjust the parameters of a disordered model. The book contains a CD-ROM with all files needed to recreate every example using the program DISCUS. The reader is free to follow the principles behind simulating disordered materials or to get down into the details and run or modify the given examples.
From climate change forecasts and pandemic maps to Lego sets and Ancestry algorithms, models encompass our world and our lives. In her thought-provoking new book, Annabel Wharton begins with a definition drawn from the quantitative sciences and the philosophy of science, but holds that history and critical cultural theory are essential to a fuller understanding of modeling. Considering changes in the medical body model and the architectural model from the Middle Ages to the twenty-first century, Wharton demonstrates the ways in which all models are historical and political. Examining how cadavers have been described, exhibited, and visually rendered, she highlights the historical dimension of the modified body and its depictions. Analyzing the varied reworkings of the Holy Sepulchre in Jerusalem (including the monumental commanderies of the Knights Templar, Alberti's Rucellai Tomb in Florence, the Franciscans' olive wood replicas, and video game renderings), she foregrounds the political force of architectural representations. And considering black boxes, instruments whose inputs we control and whose outputs we interpret but whose inner workings are beyond our comprehension, she surveys the threats posed by such opaque computational models, warning of the dangers that models pose when humans lose control of the means by which they are generated and understood. Engaging and wide-ranging, Models and World Making conjures new ways of seeing and critically evaluating how we make and remake the world in which we live.
Creativity has been integral to the development of the modern State, and yet it is becoming increasingly sidelined, especially as a result of the development of new machinic technologies including 3D printing. Arguing that inner creativity has been endangered by the rise of administrative regulation, James Griffin explores a number of reforms to ensure that upcoming regulations do take creativity into account. The State of Creativity examines how the State has become distanced from individual processes of creativity. This book investigates how the failure to incorporate creativity into administrative regulation is, in fact, adversely impacting the regulation of new technologies such as 3D and 4D printing and augmented reality, by focusing on issues concerning copyright and patents. This is an important read for intellectual property law scholars, as well as those studying computer science who wish to gain a more in-depth understanding of the current laws surrounding digital technologies such as 3D printing in our modern world. Legal practitioners wanting to remain abreast of developments surrounding 3D printing will also benefit from this book.
The science of simulation and modeling (SM) is multifaceted and complex due to the numerous applications involved, particularly since SM applications range from nuclear reactions to supermarket queuing. Simulation and Modeling: Current Technologies and Applications offers insight into the computer science aspect of simulation and modeling while integrating the business practices of SM. The book includes current issues related to simulation, such as Web-based simulation, virtual reality, augmented reality, and artificial intelligence, and depicts different methods, views, theories, and applications of simulation in one volume.
In the last decade there has been a phenomenal growth in interest in crime pattern analysis. Geographic information systems are now widely used in urban police agencies throughout industrial nations. With this, scholarly interest in understanding crime patterns has grown considerably. "Artificial Crime Analysis Systems: Using Computer Simulations and Geographic Information Systems" discusses leading research on the use of computer simulation of crime patterns to reveal hidden processes of urban crimes, taking an interdisciplinary approach by combining criminology, computer simulation, and geographic information systems into one comprehensive resource.
This unique, new book covers the whole field of electronic warfare modeling and simulation at a systems level, including chapters that describe basic electronic warfare (EW) concepts. Written by a well-known expert in the field with more than 24 years of experience, the book explores EW applications and techniques and the radio frequency spectrum. A detailed resource for entry-level engineering personnel in EW, military personnel with no radio or communications engineering background, technicians and software professionals, the work explains the basic concepts required for modeling and simulation that today's professionals need to understand. Practitioners find clear explanations of important mathematical concepts, such as decibel notation and spherical trigonometry, necessary for modeling and simulation. Moreover, the book describes specific types of EW equipment, how they work and how each is mathematically modeled.
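Decibel notation, one of the mathematical concepts the book singles out, is simply a logarithmic way of expressing power ratios. A minimal sketch (not taken from the book; the values are illustrative):

```python
import math

def to_db(power_ratio):
    """Express a power ratio in decibel notation: 10 * log10(ratio)."""
    return 10 * math.log10(power_ratio)

def from_db(db):
    """Invert decibel notation back to a power ratio."""
    return 10 ** (db / 10)

gain_db = to_db(1000)   # a 1000x power gain is 30 dB
```

The logarithmic scale is what makes cascaded EW link budgets tractable: gains and losses along a signal path add in dB instead of multiplying as ratios.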
Economists are increasingly using computer simulations to understand the implications of their theoretical models and to make policy recommendations. New model solution techniques are required to deal with the increasingly important role of dynamics and uncertainty in macroeconomics. This book consists of articles by leading contributors in the field showing how to use these techniques in the context of standard macroeconomic models.
Enterprise Modeling and Computing with UML bridges two fields that are closely related but are often studied in isolation: enterprise modeling and information systems modeling. The principal idea is to use a standard language for modeling information systems, UML, as a catalyst, and investigate its potential for modeling enterprises. The book shows both the potential and the limits of using UML in an enterprise modeling context, and offers a broad spectrum of ideas for aligning the development of information systems with the management of an enterprise.
Recent decades have seen very rapid progress in the development of numerical methods based on explicit control over approximation errors. It may be said that a new direction is now forming in numerical analysis, the main goal of which is to develop methods of reliable computation. In general, a reliable numerical method must solve two basic problems: (a) generate a sequence of approximations that converges to a solution, and (b) verify the accuracy of these approximations. A computer code for such a method must therefore consist of two respective blocks: a solver and a checker.
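The solver/checker split can be illustrated with a toy example that is not from the book: a bisection root-finder whose "checker" half returns a guaranteed error bound alongside the approximation, because the root is provably bracketed inside the final interval.

```python
def solve_and_check(f, a, b, tol=1e-10):
    """Bisection with a certified error bound.

    Returns (approximation, error_bound) where the true root is
    guaranteed to lie within error_bound of the approximation."""
    if f(a) * f(b) > 0:
        raise ValueError("f must change sign on [a, b]")
    while (b - a) / 2 > tol:          # solver block: refine the bracket
        m = (a + b) / 2
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
    m = (a + b) / 2
    return m, (b - a) / 2             # checker block: certified bound

root, err = solve_and_check(lambda x: x * x - 2, 0.0, 2.0)
```

Here verification is trivial because bisection maintains a sign-change bracket; the methods the book describes provide analogous a posteriori error bounds for far harder problems, such as discretized PDEs.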
This handbook serves as a comprehensive, systematic reference to the major mathematical models used in radio engineering and communications, and presents computer simulation algorithms to help the reader estimate parameters of radio systems. It provides the technical details necessary to design and analyze radar, communication, radio navigation, radio control, electronic intelligence and electronic warfare systems. Mathcad routines, cited in the handbook, should help the reader to optimize radar system performance analysis, and can be used to create custom-made software that better answers specific needs.
Parallel CFD 2000, the twelfth in an international series of meetings featuring computational fluid dynamics research on parallel computers, was held May 22-25, 2000 in Trondheim, Norway.
Evolutionary models (e.g., genetic algorithms, artificial life) are emerging as an important new tool for geographic information systems for a number of reasons. First, they are highly appropriate for modelling geographic phenomena; second, geographical problems are often spatially separable (broken down into logical or regional sub-problems), and evolutionary algorithms can exploit this structure; and finally, the ability to store, manipulate, and visualize spatial data has increased to the point that space-time attribute databases can be easily handled. This book serves as a guide to the evolutionary modelling of spatial phenomena.
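As a purely illustrative sketch, not drawn from the book, a genetic algorithm applied to a spatial problem might evolve candidate (x, y) locations toward a target point; the selection scheme, mutation scale, and all parameter values below are arbitrary assumptions.

```python
import random

def evolve(target, pop_size=50, generations=200, sigma=0.5):
    """Toy genetic algorithm: evolve (x, y) points toward a target location."""
    random.seed(42)  # fixed seed so the run is repeatable
    pop = [(random.uniform(-10, 10), random.uniform(-10, 10))
           for _ in range(pop_size)]
    # fitness: negative squared distance to the target (higher is better)
    fitness = lambda p: -((p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2)
    for _ in range(generations):
        # tournament selection: keep the fitter of two random candidates
        parents = [max(random.sample(pop, 2), key=fitness)
                   for _ in range(pop_size)]
        # Gaussian mutation produces the next generation
        pop = [(x + random.gauss(0, sigma), y + random.gauss(0, sigma))
               for x, y in parents]
    return max(pop, key=fitness)

best = evolve((3.0, -4.0))
```

In a real spatial application the fitness function would score a candidate against geographic data (land cover, travel cost, and so on), which is where the GIS integration the book discusses comes in.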
The Global Forest Products Model (GFPM) book provides a complete introduction to this widely applied computer model. The GFPM is a dynamic economic equilibrium model that is used to predict production, consumption, trade, and prices of 14 major forest products in 180 interacting countries. The book thoroughly documents the methods, data, and computer software of the model, and demonstrates the model's usefulness in addressing international economic and environmental issues.
Computer programs that simulate complex processes in the real world can provide a quantitative tool for determining how much debt can be added safely to a company's capital structure. The increasing number of bankruptcies and defaults in today's international business arena results from debt overload and points to major shortcomings in the conventional financial evaluation process. In this book, Roy L. Nersesian describes why current methods of risk management fail and how computer simulation can be employed to determine the safe level of debt more accurately. Because the decision to add debt to an organization requires favorable, and essentially independent, decisions from both the borrower and the lender, it is necessary to quantify both perspectives. Through actual examples, readers will learn how to do this and how to translate an actual business situation into a simulation model or program. Current evaluation systems, according to Nersesian, fail to incorporate the cyclical nature of business activity, and all too often result in an overly optimistic projection of cash flow. Simulation techniques are better able to incorporate the transience of good times, putting quantitative analysis of risk on par with quantitative analysis of reward. They also reduce the role of speculative, and highly subjective, judgment: decision-makers who are not personally familiar with a particular business area, for example, assign more risk to it than those who are. A quantified risk management system enables executives to rank projects by degree of risk much as they currently rank them by degree of profitability. The book presents the concept of simulation in terms that can be understood by generalists in corporations and financial institutions, while also giving computer programmers an understanding of risk management principles.
It will provide a valuable resource for financial executives, planners, and strategists in corporate and governmental organizations; bank lending officers; and computer programmers working with these organizations.
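The kind of analysis described can be sketched with a hypothetical Monte Carlo model (this is not Nersesian's model, and every figure below is invented for illustration): annual cash flow follows a noisy business cycle, and the simulation estimates the probability of failing to cover a fixed debt service at least once over the life of the loan.

```python
import math
import random

def default_probability(debt_service, base_cash=100.0, amplitude=30.0,
                        noise=15.0, years=10, trials=10_000):
    """Estimate P(cash flow < debt service in at least one year).

    Cash flow = baseline + sinusoidal business cycle + Gaussian noise.
    All parameters are hypothetical, chosen only for illustration."""
    random.seed(0)  # fixed seed for a repeatable estimate
    failures = 0
    for _ in range(trials):
        phase = random.uniform(0, 2 * math.pi)   # random start in the cycle
        for t in range(years):
            cycle = amplitude * math.sin(phase + 2 * math.pi * t / 7)
            cash = base_cash + cycle + random.gauss(0, noise)
            if cash < debt_service:              # cannot cover debt service
                failures += 1
                break
    return failures / trials

p = default_probability(debt_service=80.0)
```

Unlike a single-scenario cash-flow projection, the estimate reflects the trough years of the cycle, which is exactly the transience of good times that the blurb says conventional evaluations miss.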
Volume 37 is concerned with the use and role of modelling in chemical kinetics and seeks to show the interplay of theory or simulation with experiment in a diversity of physico-chemical areas in which kinetics measurements provide significant physical insight. Areas of application covered within the volume include electro- and interfacial chemistry, physiology, biochemistry, solid state chemistry, and chemical engineering.
This volume covers the integration of fuzzy logic and expert systems. A vital resource in the field, it includes techniques for applying fuzzy systems to neural networks for modeling and control, systematic design procedures for realizing fuzzy neural systems, techniques for the design of rule-based expert systems using the massively parallel processing capabilities of neural networks, the transformation of neural systems into rule-based expert systems, the characteristics and relative merits of integrating fuzzy sets, neural networks, genetic algorithms, and rough sets, and applications to system identification and control as well as nonparametric, nonlinear estimation. Practitioners, researchers, and students in industrial, manufacturing, electrical, and mechanical engineering, as well as computer scientists and engineers, will appreciate this reference source to diverse application methodologies.
The increased computational power and software tools available to engineers have increased the use of, and dependence on, modeling and computer simulation throughout the design process. These tools have given engineers the capability of designing highly complex systems and computer architectures that were previously unthinkable. Every complex design project, from integrated circuits to aerospace vehicles to industrial manufacturing processes, requires these new methods. This book fulfills the essential need of system and control engineers at all levels to understand modeling and simulation. Written as a true text/reference, it has become a standard for senior/graduate-level courses in EE departments worldwide and a resource for professionals in the area keeping their skills up to date. * Presents a working foundation necessary for compliance with High Level Architecture (HLA) standards
Revitalize your architectural visualizations by bringing new levels of realism to your day and night interior scenes. This book features full-color, step-by-step tutorials to develop a firm understanding of the processes and techniques involved in creating jaw-dropping 3d visualizations for top marketing agencies. This second volume includes day and night lighting of an atrium scene using seasoned tools and techniques to deploy V-Ray 5 and 3ds Max 2020. It has never been quicker and easier to create the industry's top-of-the-range 3d marketing visuals. The book starts with an overview of the best techniques to approach clients via emails, calls, meetings, and social media. There are also key insights into the best practices of handling projects, pricing, contracts, invoices, pre-production, production, post-production, etc. The subsequent step takes users through the installation of V-Ray 5 and the process of accessing the V-Ray Material browser dialog. Throughout the book, users are taken through VRayMtl functions such as Diffuse, Roughness, Reflect, Glossiness, Metalness, Refract, IOR, Abbe number, Fog color, Translucency, BRDF, Coat, Sheen and Bump. Users will also learn how to use procedural maps such as VRayBitmap, VRayTriplanarTex, Bricks, Metals, Carpaint, VRayDisplacementMod, VRayUVWRandomizer, VRayMultiSubTex, VRayPointCloudColor, VRayDirt, VRayAerialPerspective, VRayLightMtl, VRayMtlWrapper, VRayOverrideMtl, VRay2SidedMtl, VRayBlendMtl and VRayEdgesTex. Users will have a rare insight into all functionalities of a V-Ray camera, VRayLight objects, Render settings, Frame buffer, Global switches, IPR options, Bucket and Progressive image samplers, Image filters, Global DMC, Color mapping, Brute force GI, Light cache, Color management, Distributed rendering, Render elements, V-Ray image file format, VFB History settings, VFB Lens Effects, LightMix, Film tonemap, Hue/Saturation, Lookup Table and much more.
Finally, users will embark on the amazing journey of utilizing the previous chapters to create eye-catching 3d marketing visuals through the meticulous process of pre-production, production and post-production of both day and night scenes/lighting. The tips and tricks section will extensively cover key sections about Verified views for planning applications, Parametric modeling with AdvArray, anima (R), project manager plug-in, Verge3d, Webrotate 360, Accucities 3d city models and much more.
The 2nd edition of Chopra's Google SketchUp provides key pedagogical elements that help prepare readers for the workforce. The content provides real-world, applied material, including improved PowerPoint presentations and how-to animations. Additional features include content updated to reflect software upgrades and market use; new pedagogical elements and interior design; and more robust resources appropriate for different users of Google SketchUp. The book also addresses the similarities between the adapted title, Google SketchUp 8 for Dummies, and Google SketchUp 2. This includes a title that contains the core content and basic software how-to from For Dummies; a TOC revised to reflect the course; and new material developed and written by the writer and academic advisors/reviewers. This edition goes beyond basic software use to teach further portions of SketchUp.
This book has the unique intention of returning the mathematical tools of neural networks to the biological realm of the nervous system, where they originated a few decades ago. It aims to introduce, in a didactic manner, two relatively recent developments in neural network methodology, namely recurrence in the architecture and the use of spiking or integrate-and-fire neurons. In addition, the neuro-anatomical processes of synapse modification during development, training, and memory formation are discussed as realistic bases for weight adjustment in neural networks. While neural networks have many applications outside biology, where it is irrelevant precisely which architecture and which algorithms are used, a network that models or simulates a neuro-biological phenomenon must bear a close relationship to that phenomenon. A recurrent architecture, the use of spiking neurons, and appropriate weight-update rules all contribute to the plausibility of a neural network in such a case. Therefore, the first half of this book lays the foundations for the application of neural networks as models for the various biological phenomena treated in the second half, which include neural network models of sensory and motor control tasks that implement one or several of the requirements for biological plausibility.
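An integrate-and-fire neuron of the kind the book discusses can be sketched in a few lines. This toy leaky integrate-and-fire implementation, with simple Euler integration and arbitrary parameter values, is an illustrative assumption, not the book's code:

```python
def lif_spikes(current, t_max=100.0, dt=0.1, tau=10.0,
               v_rest=0.0, v_thresh=1.0):
    """Leaky integrate-and-fire neuron (Euler integration).

    The membrane potential v leaks toward rest while integrating the
    input current; when v crosses threshold, a spike time is recorded
    and v resets to rest."""
    v, spikes, t = v_rest, [], 0.0
    while t < t_max:
        v += dt * (-(v - v_rest) + current) / tau   # leaky integration
        if v >= v_thresh:                           # threshold crossing
            spikes.append(t)                        # record the spike time
            v = v_rest                              # reset after the spike
        t += dt
    return spikes

spike_times = lif_spikes(current=1.5)   # constant supra-threshold input
```

With a constant supra-threshold input the neuron fires periodically; it is this timing behavior, absent from conventional rate-coded units, that makes spiking neurons the more biologically plausible building block the book argues for.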