This book is a compilation of a selected subset of research articles presented at the Eighth INFORMS Computing Society Conference, held in Chandler, Arizona, from January 8 to 10, 2003. The articles in this book represent the diversity and depth of the interface between OR/MS (operations research and the management sciences) and CS/AI (computer science and artificial intelligence). This volume starts with two papers that represent the reflective and integrative thinking that is critical to any scientific discipline. These two articles present philosophical perspectives on computation, covering a variety of traditional and newer methods for modeling, solving, and explaining mathematical models. The next set includes articles that study machine learning and computational heuristics, and is followed by articles that address issues in performance testing of solution algorithms and heuristics. These two sets of papers demonstrate the richness of thought that takes place at the OR/MS and CS/AI interface. The final set of articles demonstrates the usefulness of these and other methods at the interface towards solving problems in the real world, covering e-commerce, workflow, electronic negotiation, music, parallel computation, and telecommunications. The articles in this collection represent the results of cross-fertilization between OR/MS and CS/AI, making possible advances that could not have been achieved in isolation. The continuing aim of the INFORMS Computing Society and this research conference is to invigorate and further develop this interface.
The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, computer-intensive methods such as the bootstrap and cross-validation freed practitioners from the limitations of parametric models, and paved the way towards the 'big data' era of the 21st century. Nonetheless, there is a further step one may take, i.e., going beyond even nonparametric models; this is where the Model-Free Prediction Principle is useful. Interestingly, being able to predict a response variable Y associated with a regressor variable X taking on any possible value seems to inadvertently also achieve the main goal of modeling, i.e., trying to describe how Y depends on X. Hence, as prediction can be treated as a by-product of model-fitting, key estimation problems can be addressed as a by-product of being able to perform prediction. In other words, a practitioner can use Model-Free Prediction ideas in order to additionally obtain point estimates and confidence intervals for relevant parameters leading to an alternative, transformation-based approach to statistical inference.
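To give a flavour of the model-free idea described above, here is a minimal sketch, in Python, of a bootstrap prediction interval built directly from observed data in a simple i.i.d. setting. The function name, the resampling scheme and the example data are illustrative assumptions, not the book's actual algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

def model_free_prediction_interval(y, alpha=0.05, n_boot=2000):
    """Bootstrap prediction interval for the next observation of an
    (approximately) i.i.d. sample, built from the data alone and without
    assuming normality. Purely an illustrative sketch of the model-free
    spirit, not the book's algorithm."""
    n = len(y)
    future = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(y, size=n, replace=True)  # bootstrap world
        future[b] = rng.choice(resample)                # one "future" draw
    lo, hi = np.quantile(future, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Example with skewed (non-Gaussian) data, where a normal-theory interval
# centred at the mean would be noticeably off.
y = rng.exponential(scale=2.0, size=200)
print(model_free_prediction_interval(y))
```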
Understanding how the human brain represents, stores, and processes information is one of the greatest unsolved mysteries of science today. The cerebral cortex is the seat of most of the mental capabilities that distinguish humans from other animals and, once understood, it will almost certainly lead to a better knowledge of other brain nuclei. Although neuroscience research has been underway for 150 years, very little progress has been made. What is needed is a key concept that will trigger a full understanding of existing information, and will also help to identify future directions for research. This book aims to help identify this key concept. Including contributions from leading experts in the field, it provides an overview of different conceptual frameworks that indicate how some pieces of the neuroscience puzzle fit together. It offers a representative selection of current ideas, concepts, analyses, calculations and computer experiments, and also looks at important advances such as the application of new modeling methodologies. Computational Models for Neuroscience will be essential reading for anyone who needs to keep up-to-date with the latest ideas in computational neuroscience, machine intelligence, and intelligent systems. It will also be useful background reading for advanced undergraduates and postgraduates taking courses in neuroscience and psychology.
Learn to crunch huge amounts of data with PowerPivot and Power Query. Do you have a ton of data you need to make sense of? Microsoft's Excel program can handle amazingly large data sets, but you'll need to get familiar with PowerPivot and Power Query to get started. And that's where Dummies comes in. With step-by-step instructions--accompanied by ample screenshots--Excel PowerPivot & Power Query For Dummies will teach you how to save time, simplify your processes, and enhance your data analysis and reporting. Use Power Query to discover, connect to, and import your organization's data. Then use PowerPivot to model it in Excel. You'll also learn to: make use of databases to store large amounts of data; use custom functions to extend and enhance Power Query; add the functionality of formulas to PowerPivot; and publish data to SharePoint. If you're expected to wrangle, interpret, and report on large amounts of data, Excel PowerPivot & Power Query For Dummies gives you the tools you need to get up to speed quickly.
This book highlights the latest research articles presented at the second Digital Marketing & eCommerce Conference in June 2021. Papers include a diverse set of digital marketing and eCommerce-related topics such as user psychology and behavior in social commerce, influencer marketing in social commerce, social media monetization strategies, social commerce characteristics and their impact on user behavior, branding on social media, social media-based business models, user privacy and security protection on social media, social video marketing and commerce, among other topics.
Computer simulations not only belong to the most important methods for the theoretical investigation of granular materials, but provide the tools that have enabled much of the expanding research by physicists and engineers. The present book is intended to serve as an introduction to the application of numerical methods to systems of granular particles. Accordingly, the emphasis is on a general understanding of the subject rather than on the presentation of the latest advances in numerical algorithms. Although a basic knowledge of C++ is needed for the understanding of the numerical methods and algorithms in the book, it avoids the use of elegant but complicated algorithms so as to remain accessible to those who prefer a different programming language. While the book focuses more on models than on the physics of granular material, many applications to real systems are presented.
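As a taste of the kind of algorithm such books build up, the following is a minimal sketch (in Python rather than the book's C++, with made-up parameter values) of explicit time stepping for two particles interacting through a linear spring-dashpot contact force, one of the simplest ingredients of granular simulations.

```python
import numpy as np

# Minimal sketch of the kind of time stepping used in granular particle
# simulations: two spheres in one dimension interacting through a linear
# spring-dashpot contact force. All parameter names and values here are
# illustrative assumptions, not taken from the book.

k, gamma = 1e4, 0.5          # contact stiffness and damping coefficient
radius, mass = 0.01, 1e-3    # particle radius (m) and mass (kg)
dt = 1e-5                    # time step (s)

x = np.array([0.0, 0.025])   # positions: particle 0 left, particle 1 right
v = np.array([0.5, -0.5])    # velocities: the particles approach each other

def contact_force(x, v):
    """Repulsive normal force magnitude for a linear spring-dashpot contact."""
    overlap = 2 * radius - (x[1] - x[0])
    if overlap <= 0:
        return 0.0                      # no contact, no force
    closing_speed = v[0] - v[1]
    return max(k * overlap + gamma * closing_speed, 0.0)

for step in range(20_000):              # integrate 0.2 s of motion
    f = contact_force(x, v)
    a = np.array([-f, f]) / mass        # equal and opposite accelerations
    v = v + a * dt                      # semi-implicit Euler update
    x = x + v * dt

print(x, v)  # the particles have collided and separated with reduced speed
```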
This comprehensive book draws together experts to explore how knowledge technologies can be exploited to create new multimedia applications, and how multimedia technologies can provide new contexts for the use of knowledge technologies. Thorough coverage of all relevant topics is given. The step-by-step approach guides the reader from fundamental enabling technologies of ontologies, analysis and reasoning, through to applications which have hitherto had less attention.
This book provides a conceptual and computational framework to study how the nervous system exploits the anatomical properties of limbs to produce mechanical function. The study of the neural control of limbs has historically emphasized the use of optimization to find solutions to the muscle redundancy problem. That is, how does the nervous system select a specific muscle coordination pattern when the many muscles of a limb allow for multiple solutions? I revisit this problem from the emerging perspective of neuromechanics that emphasizes finding and implementing families of feasible solutions, instead of a single and unique optimal solution. Those families of feasible solutions emerge naturally from the interactions among the feasible neural commands, the anatomy of the limb, and the constraints of the task. Such an alternative perspective on the neural control of limb function is not only biologically plausible, but also sheds light on the most central tenets and debates in the fields of neural control, robotics, rehabilitation, and brain-body co-evolutionary adaptations. This perspective developed from courses I taught to engineers and life scientists at Cornell University and the University of Southern California, and is made possible by combining fundamental concepts from mechanics, anatomy, mathematics, robotics and neuroscience with advances in the field of computational geometry. Fundamentals of Neuromechanics is intended for neuroscientists, roboticists, engineers, physicians, evolutionary biologists, athletes, and physical and occupational therapists seeking to advance their understanding of neuromechanics. Therefore, the tone is decidedly pedagogical, engaging, integrative, and practical to make it accessible to people coming from a broad spectrum of disciplines. I attempt to tread the line between making the mathematical exposition accessible to life scientists, and conveying the wonder and complexity of neuroscience to engineers and computational scientists. While no one approach can hope to definitively resolve the important questions in these related fields, I hope to provide you with the fundamental background and tools to allow you to contribute to the emerging field of neuromechanics.
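To make the idea of "families of feasible solutions" concrete, here is a minimal sketch assuming a hypothetical planar joint driven by three muscles with made-up moment arms; it simply rejection-samples activation patterns that produce (approximately) the same joint torque, showing that the feasible set is a family rather than a single optimum. It is not code from the book.

```python
import numpy as np

# Illustrative sketch of the muscle redundancy idea (not code from the book):
# a hypothetical planar joint driven by three muscles with made-up moment
# arms R. Many activation patterns a in [0, 1]^3 produce the same net joint
# torque, so the feasible solutions form a family rather than a single point.

rng = np.random.default_rng(1)
R = np.array([2.0, -1.5, 1.0])   # hypothetical moment arms, one per muscle
tau_target = 1.2                 # desired net joint torque
tol = 0.05                       # tolerance on matching the torque

# Rejection-sample the feasible activation set {a : R @ a ~ tau, 0 <= a <= 1}
candidates = rng.uniform(0.0, 1.0, size=(200_000, 3))
feasible = candidates[np.abs(candidates @ R - tau_target) < tol]

print(f"{len(feasible)} feasible activation patterns found; for example:")
print(feasible[:3])
```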
The importance of knowledge and information technology management has been emphasized by both researchers and practitioners in order for companies to compete in the global market. Such technologies have now become crucial in the sense that there is a need to understand business and operations strategies, as well as how the development of IT contributes to knowledge management and therefore increases competitiveness. Knowledge and Information Technology Management: Human and Social Perspectives strives to explore the human resource and social dimensions of knowledge and IT management, to discuss the opportunities and major issues related to the management of people along the supply chain in Internet marketing, and to provide an understanding of how human resource and IT management should complement each other for improved communication and competitiveness.
Here is a thorough, not-overly-complex introduction to the three technical foundations for multimedia applications across the Internet: communications (principles, technologies and networking); compressive encoding of digital media; and Internet protocol and services. All the contributing systems elements are explained through descriptive text and numerous illustrative figures; the result is a book well-suited to non-specialists, preferably those with a technical background, who need well-composed tutorial introductions to the three foundation areas. The text discusses the latest advances in digital audio and video encoding, optical and wireless communications technologies, high-speed access networks, and IP-based media streaming, all crucial enablers of the multimedia Internet.
With the fast growth of multimedia information, content-based video analysis, indexing and representation have attracted increasing attention in recent years. Many applications have emerged in these areas, such as video-on-demand, distributed multimedia systems, digital video libraries, distance learning/education, entertainment, surveillance and geographical information systems. The need for content-based video indexing and retrieval was also recognized by ISO/MPEG, and a new international standard called "Multimedia Content Description Interface" (or, in short, MPEG-7) was initiated in 1998 and finalized in September 2001. In this context, a systematic and thorough review of existing approaches as well as the state-of-the-art techniques in video content analysis, indexing and representation is presented in this book. In addition, we specifically elaborate on a system which analyzes, indexes and abstracts movie contents based on the integration of multiple media modalities. The content of each part of this book is briefly previewed below. In the first part, we segment a video sequence into a set of cascaded shots, where a shot consists of one or more continuously recorded image frames. Both raw and compressed video data will be investigated. Moreover, considering that there are always non-story units in real TV programs such as commercials, a novel commercial break detection/extraction scheme is developed which exploits both audio and visual cues to achieve robust results. Specifically, we first employ visual cues such as the video data statistics, the camera cut frequency, and the existence of delimiting black frames between commercials and programs, to obtain coarse-level detection results.
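As a rough illustration of two of the visual cues mentioned above (delimiting black frames and camera cuts), here is a minimal Python sketch operating on synthetic grey-level frames. The thresholds, the histogram measure and the synthetic data are illustrative assumptions, not the detection scheme developed in the book.

```python
import numpy as np

# Rough illustration of two visual cues: delimiting black frames and camera
# cuts. Thresholds and synthetic frames are illustrative assumptions only.

def is_black_frame(frame, lum_thresh=20):
    """Flag frames whose mean grey level is near zero (commercial delimiters)."""
    return frame.mean() < lum_thresh

def is_shot_cut(prev, curr, dist_thresh=0.4):
    """Flag a cut when consecutive grey-level histograms differ strongly."""
    h1, _ = np.histogram(prev, bins=32, range=(0, 255))
    h2, _ = np.histogram(curr, bins=32, range=(0, 255))
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    return 0.5 * np.abs(h1 - h2).sum() > dist_thresh  # total-variation distance

# Synthetic "video": a dark scene, one black delimiting frame, a bright scene.
rng = np.random.default_rng(0)
frames = ([rng.integers(30, 80, (120, 160)) for _ in range(5)]
          + [np.zeros((120, 160), dtype=int)]
          + [rng.integers(150, 220, (120, 160)) for _ in range(5)])

for i in range(1, len(frames)):
    if is_black_frame(frames[i]):
        print(f"frame {i}: black delimiting frame")
    elif is_shot_cut(frames[i - 1], frames[i]):
        print(f"frame {i}: shot cut")
```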
This volume comprises eight contributed chapters that report the latest findings on intelligent approaches to multimedia data analysis. Multimedia data is a combination of different discrete and continuous content forms such as text, audio, images, videos, animations and interactional data. The presence of at least one continuous medium in the transmitted information makes it multimedia information. Because of this variety, multimedia data present varying degrees of uncertainty and imprecision, which are not easy to handle with the conventional computing paradigm. Soft computing technologies are quite efficient at handling the imprecision and uncertainty of multimedia data, and they are flexible enough to process real-world information. Proper analysis of multimedia data finds wide application in medical diagnosis, video surveillance, text annotation, etc. This volume is intended to be used as a reference by undergraduate and postgraduate students of the disciplines of computer science, electronics and telecommunication, information science and electrical engineering. THE SERIES: FRONTIERS IN COMPUTATIONAL INTELLIGENCE The series Frontiers in Computational Intelligence is envisioned to provide comprehensive coverage and understanding of cutting-edge research in computational intelligence. It intends to augment the scholarly discourse on all topics relating to advances in artificial life and machine learning in the form of metaheuristics, approximate reasoning, and robotics. The latest research findings are coupled with applications to varied domains of engineering and computer science. This field is growing steadily, especially with the advent of novel machine learning algorithms being applied to different domains of engineering and technology. The series brings together leading researchers who intend to continue to advance the field and create broad knowledge about the most recent state of the art.
The third entry in the Jim Blinn's Corner series, this is, like the others, a handy compilation of selected installments of his influential column. But here, for the first time, you get the "Director's Cut" of the articles: revised, expanded, and enhanced versions of the originals. What's changed? Improved mathematical notation, more diagrams, new solutions. What remains the same? All the things you've come to rely on: straight answers, irreverent style, and innovative thinking. This is Jim Blinn at his best, now even better.
OCS is becoming the dominant product for large-scale collaboration, blending core functions of e-mail, file serving, and diary management with the additional functionality of web conferencing, instant messaging, and wireless access. Pro Oracle Collaboration Suite 10g provides all you need to know to install and configure OCS for use, but this book is much more than a to-do list. It covers the architecture of the server processes and the applications, giving you the theoretical background to take OCS beyond the basics. There's no need to worry if you're new to the Oracle database, the Oracle Components for Java environment (OC4J), HTTP web servers, or LDAP Internet directories; everything is explained carefully. But if you are already familiar with these topics, you'll learn how to fully exploit them in order to optimize your OCS installation.
Few industries fit the description of high turbulence and high velocity better than the computer games industry. Relatively young, rapidly evolving, and frequently experiencing disruptive innovation, its potential for growth and new business opportunities seems barely exhausted. And indeed, in the current economic climate this industry seems positively recession-proof. Those making and those playing games use digital technology to share an enthusiasm for the industry's products that we rarely find elsewhere. This fascination with computer games and gaming and the economic significance of the industry make it one of the most remarkable socio-cultural phenomena of our world. As the industry emerges, we become more aware of the need to trace it and to understand it in all its meanings, and the challenges it poses. The 10 chapters in this book provide an examination of the computer games industry from 10 different perspectives, discussing the following aspects:
- The spatial logic of the industry
- Business model innovation
- Games development - a risky business
- Co-production and the role of the consumer
- Business sustainability
- The place of creativity
- Emerging people management challenges
- Violent games and work well-being
- A critical perspective on games as phantasmagoric commodities
- Virtual worlds - blurring boundaries between realities and games
Even as developments in photorealistic computer graphics continue to affect our work and leisure activities, practitioners and researchers are devoting more and more attention to non-photorealistic (NPR) techniques for generating images that appear to have been created by hand. These efforts benefit every field in which illustrations, thanks to their ability to clarify, emphasize, and convey very precise meanings, offer advantages over photographs. These fields include medicine, architecture, entertainment, education, geography, publishing, and visualization.
Research in the field of multimedia metadata is especially challenging: many scientific publications and reports on research projects are published every year, and the range of possible applications is diverse and huge. This book gives an overview of fundamental issues within the field of multimedia metadata, focusing on contextualized, ubiquitous, accessible and interoperable services at a higher semantic level. It provides a selection of basic articles that form a foundation for multimedia metadata research. Furthermore, it presents a view of the current state of the art in multimedia metadata research. It provides information from versatile application domains (Broadcasting, Interactive TV, E-Learning and Social Software) such as: Multimedia on the Web 2.0 - Databases for Multimedia (Meta-)Data - Multimedia Information Retrieval and Evaluation - Multimedia Metadata Standards - Ontologies for Multimedia. The multimedia metadata community (www.multimedia-metadata.info), from which this book originated, brings together experts from research and industry in the area of multimedia metadata research and application development. The community bridges the gap between academic research and industrial-scale development of innovative products. By summarizing the work of the community, this book contributes to the aforementioned fields by addressing these topics for a broad range of readers.
* With Oracle 10g, for the first time, much of the Spatial functionality is provided for free (rather than as a priced option) in the database, thus massively increasing the potential audience.
* Shows how any Oracle application that has a spatial element (e.g. postcode) can take advantage of Spatial functionality.
* Contains case studies of more advanced applications of Spatial in healthcare, telecom, retail, and distribution.
* Oracle Spatial is recognized to be the standard platform for enterprise land management, mapping, telecom, transportation, and utility applications. Every major GIS tool vendor supports Oracle Spatial and all major map data providers deliver their data in Oracle Spatial format.
* The book will be based on extensive feedback from training courses, discussion lists, and customers. It will recommend best practice approaches to the most common problems with which developers struggle.
* The authors are all experienced and well-respected experts. The Oracle personnel contributing have a decade of experience with Spatial and in helping partners and customers fully leverage its capabilities. The technical reviewers include lead developers of the product.
* Rather than simplified code snippets, the book provides real solutions that people can then build upon themselves.
This book brings together some of the most influential pieces of research undertaken around the world in design synthesis. It is the first comprehensive work of this kind and covers all three aspects of research in design synthesis:
- understanding what constitutes and influences synthesis;
- the major approaches to synthesis;
- the diverse range of tools that are created to support this crucial design task.
The chapters are comprised of cutting edge research and established methods, written by the originators of this growing field of research. They cover all major generic synthesis approaches, i.e., composition, retrieval, change and repair, and tackle problems that come from a wide variety of domains within architecture and engineering, as well as areas of application including clocks, sensors and medical devices. The book contains an editorial introduction to the chapters and the broader context of research they represent. With its range of tools and methods covered, it is an ideal introduction to design synthesis for those intending to research in this area, as well as being a valuable source of ideas for educators and practitioners of engineering design.
The first edition (94301-3) was published in 1995 in TIMS and had 2264 regular US sales, 928 IC, and 679 bulk. This new edition updates the text to Mathematica 5.0 and offers a more extensive treatment of linear algebra. It has been thoroughly revised and corrected throughout.
You may like...
- Topology Optimization - Theory, Methods… by Martin Philip Bendsoe, Ole Sigmund (Hardcover, R3,978)
- Complements of Higher Mathematics by Marin Marin, Andreas Oechsner (Hardcover, R2,924)
- Disciple - Walking With God by Rorisang Thandekiso, Nkhensani Manabe (Paperback)
- Spatial Patterns - Higher Order Models… by L.A. Peletier, W.C. Troy (Hardcover, R1,731)