Welcome to Loot.co.za!
There are many challenges facing organizations today as they incorporate electronic marketing methods into their strategy. Advances in Electronic Marketing examines these challenges within three major themes: the global environment, the strategic/technological realm, and the buyer behavior of online consumers. Each chapter raises important issues, practical applications, and relevant solutions for the electronic marketer. Advances in Electronic Marketing not only addresses Internet marketing and the World Wide Web, but also other electronic marketing tools, such as geographic information systems, database marketing, and mobile advertising. This book provides researchers and practitioners with an updated source of knowledge on electronic marketing methods.
This book aims at providing a view of current trends in the development of research on Synthesis and Control of Discrete Event Systems. The papers collected in this volume are based on a selection of talks given in June and July 2001 at two independent meetings: the Workshop on Synthesis of Concurrent Systems, held in Newcastle upon Tyne as a satellite event of ICATPN/ICACSD and organized by Ph. Darondeau and L. Lavagno, and the Symposium on the Supervisory Control of Discrete Event Systems (SCODES), held in Paris as a satellite event of CAV and organized by B. Caillaud and X. Xie. Synthesis is a generic term that covers all procedures aiming to construct, from specifications given as input, objects matching these specifications. Theories and applications of synthesis have long been studied and developed in connection with logics, programming, automata, discrete event systems, and hardware circuits. Logics and programming are outside the scope of this book, whose focus is on Discrete Event Systems and Supervisory Control. The stress today in this field is on a better applicability of theories and algorithms to practical systems design. Coping with decentralization or distribution, and caring for an efficient realization of the synthesized systems or controllers, are of the utmost importance in areas as diverse as the supervision of embedded or manufacturing systems and the implementation of protocols in software or in hardware.
Parallel and distributed computing is one of the foremost technologies for shaping the future of computing. New Horizons of Parallel and Distributed Computing is a collection of self-contained chapters written by pioneering researchers to provide solutions for newly emerging problems in this field. This volume will not only provide novel ideas, work in progress and state-of-the-art techniques in the field, but will also stimulate future research activities in the area of parallel and distributed computing with applications. New Horizons of Parallel and Distributed Computing is intended for industry researchers and developers, as well as for academic researchers and advanced-level students in computer science and electrical engineering. A valuable reference work, it is also suitable as a textbook.
This text looks at how computers are being used in primary classrooms and how they could be used better. Its three sections focus upon: how do we investigate learning through talk around computers? What affects the quality of group work around computers? What can teachers do to improve this?
Images have always been very important in human life. Their applications range from primitive communication between humans of all ages to advanced technologies in the industrial, medical and military fields. The increased possibilities to capture and analyze images have contributed to making "image processing" the large scientific field it is today. Many techniques are being applied, including soft computing. "Soft Computing in Image Processing: Recent Advances" follows the edited volumes "Fuzzy Techniques in Image Processing" (volume 52, published in 2000) and "Fuzzy Filters for Image Processing" (volume 122, published in 2003), and covers a wide range of both practical and theoretical applications of soft computing in image processing. The 16 excellent chapters of the book have been grouped into five parts: Applications in Remote Sensing, Applications in Image Retrieval, Applications in Image Analysis, Other Applications, and Theoretical Contributions. The focus of the book is on practical applications, which makes it interesting for every researcher who is involved with soft computing, image processing, or both.
From the Foreword: Modern digital signal processing applications present a major challenge to the system designer. Algorithms are becoming increasingly complex, and yet they must be realized with tight performance constraints. Nevertheless, these DSP algorithms are often built from many constituent canonical subtasks (e.g., IIR and FIR filters, FFTs) that can be reused in other subtasks. Design is then a problem of composing these core entities into a cohesive whole to provide both the intended functionality and the required performance. In order to organize the design process, there have been two major approaches. The top-down approach starts with an abstract, concise, functional description which can be quickly generated. On the other hand, the bottom-up approach starts from a detailed low-level design where performance can be directly assessed, but where the requisite design and interface detail take a long time to generate. In this book, the authors show a way to effectively resolve this tension by retaining the high-level conciseness of VHDL while parameterizing it to get a good fit to specific applications through reuse of core library components. Since they build on a pre-designed set of core elements, accurate area, speed and power estimates can be percolated to high-level design routines which explore the design space. Results are impressive, and the cost model provided will prove to be very useful. Overall, the authors have provided an up-to-date approach, doing a good job of getting performance out of high-level design. The methodology provided makes good use of extant design tools, and is realistic in terms of the industrial design process. The approach is interesting in its own right, but is also of direct utility, and it will give the existing DSP CAD tools a highly competitive alternative.
The techniques described have been developed within ARPA's RASSP (Rapid Prototyping of Application Specific Signal Processors) project, and should be of great interest there, as well as to many industrial designers. Professor Jonathan Allen, Massachusetts Institute of Technology
The emergence and widespread use of personal computers and network technologies have seen the development of interest in the use of computers to support cooperative work. This volume presents the proceedings of the tenth European conference on Computer Supported Cooperative Work (CSCW). This is a multidisciplinary area that embraces the development of new technologies grounded in actual cooperative practices. These proceedings contain a collection of papers addressing novel interaction technologies for CSCW systems, new models and architectures for groupware systems, studies of communication and coordination among mobile actors, studies of cooperative work in complex settings, studies of groupware systems in actual use in real-world settings, and theories and techniques to support the development of cooperative applications. The papers present emerging technologies alongside new methods and approaches to the development of this important class of applications.
Document Processing and Retrieval: TEXPROS focuses on the design and implementation of a personal, customizable office information and document processing system called TEXPROS (a TEXt PROcessing System). TEXPROS is a personal, intelligent office information and document processing system for text-oriented documents. This system supports the storage, classification, categorization, retrieval and reproduction of documents, as well as extracting, browsing, retrieving and synthesizing information from a variety of documents. When using TEXPROS in a multi-user or distributed environment, it requires specific protocols for extracting, storing, transmitting and exchanging information. The authors have used a variety of techniques to implement TEXPROS, such as Object-Oriented Programming, Tcl/Tk, X-Windows, etc. The system can be used for many different purposes in many different applications, such as digital libraries, software documentation and information delivery. Audience: Provides in-depth, state-of-the-art coverage of information processing and retrieval, and documentation for such professionals as database specialists, information systems and software developers, and information providers.
This volume is a how-to guide to the use of computers in library-based adult literacy programs. Since the commitment to literacy training has become an integral part of libraries' efforts to offer equal access to information, Linda Main and Char Whitaker provide a comprehensive study of the efficacious role the computer can play in achieving this objective. The problems and successes associated with the introduction of computers into library literacy programs, as well as financial requirements, space, furniture, training, and the effect on other library operations are central to the study. The text also features a design for an ideal computerized literacy lab, an overview of compatible software, both existing and proposed, and a look at the rewards and challenges facing librarians, professional educators, and literacy program directors in the future. Appendixes provide country-wide information on libraries currently involved in automating literacy, main suppliers of literacy software, and consulting personnel.
Identifying Emerging Trends in Technological Innovation
Doctoral programs in science and engineering are important sources of innovative ideas and techniques that might lead to new products and technological innovation. Certainly, most PhD students are not experienced researchers and are still in the process of learning how to do research. Nevertheless, a number of empirical studies also show that a high number of technological innovation ideas are produced in the early careers of researchers. The combination of the eagerness of young doctoral students to try new approaches and directions with the experience and broad knowledge of their supervisors is likely to result in an important pool of innovation potential. The DoCEIS doctoral conference on Computing, Electrical and Industrial Engineering aims at creating a space for sharing and discussing ideas and results from doctoral research in these inter-related areas of engineering. Innovative ideas and hypotheses can be better enhanced when presented and discussed in an encouraging and open environment. DoCEIS aims to provide such an environment, releasing PhD students from the pressure of presenting their propositions in more formal contexts.
Tabu Search (TS) and, more recently, Scatter Search (SS) have proved highly effective in solving a wide range of optimization problems, and have had a variety of applications in industry, science, and government. The goal of Metaheuristic Optimization via Memory and Evolution: Tabu Search and Scatter Search is to report original research on algorithms and applications of tabu search, scatter search or both, as well as variations and extensions having "adaptive memory programming" as a primary focus. Individual chapters identify useful new implementations or new ways to integrate and apply the principles of TS and SS, prove new theoretical results, or describe the successful application of these methods to real world problems.
In probability and statistics we often have to estimate probabilities and parameters in probability distributions using a random sample. Instead of using a point estimate calculated from the data we propose using fuzzy numbers which are constructed from a set of confidence intervals. In probability calculations we apply constrained fuzzy arithmetic because probabilities must add to one. Fuzzy random variables have fuzzy distributions. A fuzzy normal random variable has the normal distribution with fuzzy number mean and variance. Applications are to queuing theory, Markov chains, inventory control, decision theory and reliability theory.
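The construction this blurb describes, a fuzzy-number estimator whose membership levels come from a nested family of confidence intervals, can be sketched in a few lines of Python. This is a minimal illustration under my own assumptions (a known-variance z-interval for a normal mean; the function name and the chosen membership levels are hypothetical, not taken from the book):

```python
import math
from statistics import NormalDist


def fuzzy_mean_estimator(sample_mean, sigma, n, levels):
    """Alpha-cuts of a fuzzy estimator for a normal mean.

    The cut at membership level a (with 0 < a <= 1) is taken to be the
    two-sided (1 - a) * 100% z-confidence interval for the mean, so the
    intervals nest: lower levels give wider intervals, and level 1.0
    collapses to the point estimate itself.
    """
    cuts = {}
    for a in levels:
        z = NormalDist().inv_cdf(1 - a / 2)  # two-sided z critical value
        half = z * sigma / math.sqrt(n)
        cuts[a] = (sample_mean - half, sample_mean + half)
    return cuts


# Example: sample mean 10.0, known sigma 2.0, n = 25.
cuts = fuzzy_mean_estimator(10.0, 2.0, 25, [0.05, 0.5, 1.0])
```

Stacking these nested intervals by membership level yields a fuzzy number whose core is the usual point estimate, which is the intuition behind replacing point estimates with fuzzy estimators.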
The book presents topics in discrete biomathematics. Mathematics has been widely used in modeling biological phenomena. However, the molecular and discrete nature of basic life processes suggests that their logic follows principles that are intrinsically based on discrete and informational mechanisms. The ultimate rationale of polymers, as a key element of life, is directly based on the computational power of strings, and the intrinsic necessity of metabolism is related to the mathematical notion of a multiset. Switching between the two roots of bioinformatics suggests a change of perspective. In bioinformatics, biologists ask computer scientists to assist them in processing biological data. Conversely, in infobiotics, mathematicians and computer scientists investigate principles and theories yielding new keys for interpreting biological phenomena. Life is too important to be investigated by biologists alone, and though computers are essential to process data from biological laboratories, many fundamental questions about life can be appropriately answered by a perspicacious intervention of mathematicians, computer scientists, and physicists, who will complement the work of chemists, biochemists, biologists, and medical investigators. The volume is organized in seven chapters. The first part is devoted to research topics (Discrete Information and Life; Strings and Genomes; Algorithms and Biorhythms; Life Strategies), the second to mathematical backgrounds (Numbers and Measures; Languages and Grammars; Combinations and Chances).
The International Federation for Information Processing (IFIP) is a non-profit umbrella organization for national societies working in the field of information processing. It was founded in 1960 under the auspices of UNESCO. It is organized into several technical committees. This book represents the proceedings of the 2006 conference of technical committee 8 (TC8), which covers the field of information systems. This conference formed part of IFIP's World Computer Congress in Chile. The occasion celebrated the 30th anniversary of IFIP TC8 by looking at the past, present and future of information systems. The proceedings reflect not only the breadth and depth of the work of TC8, but also the international nature of the group, with authors from 18 countries being represented in the 21 papers (including two invited papers) and 2 panels. All submissions were rigorously refereed by at least two reviewers and an associate editor and following the review and resubmission process nearly 50% of submissions were accepted. This paper introduces the papers and panels presented at the conference and published in this volume. It is never straightforward to classify a set of papers but we have made an attempt and this classification is also reflected in the sessions of the conference itself. The classification for the papers is as follows: the world of information systems - early pioneers; developing improved information systems; information systems in their domains of application; the discipline of information systems; issues of production; IT impacts on the organization; tools and modeling and new directions.
Field-Programmable Gate Arrays (FPGAs) have emerged as an attractive means of implementing logic circuits, providing instant manufacturing turnaround and negligible prototype costs. They hold the promise of replacing much of the VLSI market now held by mask-programmed gate arrays. FPGAs offer an affordable solution for customized VLSI, over a wide variety of applications, and have also opened up new possibilities in designing reconfigurable digital systems. Field-Programmable Gate Arrays discusses the most important aspects of FPGAs in a textbook manner. It provides the reader with a focused view of the key issues, using a consistent notation and style of presentation. It provides detailed descriptions of commercially available FPGAs and an in-depth treatment of the FPGA architecture and CAD issues that are the subjects of current research. The material presented is of interest to a variety of readers, including those who are not familiar with FPGA technology, but wish to be introduced to it, as well as those who already have an understanding of FPGAs, but who are interested in learning about the research directions that are of current interest.
The five digital forces (mobility and pervasive computing, cloud, big data, artificial intelligence and robotics, and social media) are poised to bring great academic and industrial breakthroughs. All stakeholders want to understand how to best harness these forces to their advantage. While literature exists for understanding each force independently, there is a lack of knowledge on how to utilize all the forces together to realize future enterprises. Advanced Digital Architectures for Model-Driven Adaptive Enterprises is an essential reference source that explores the potential in unifying the five digital forces to achieve increased levels of agility, efficiency, and scale. Featuring coverage on a wide range of topics including socio-technical systems, adaptive architectures, and enterprise modeling, this book is ideally designed for managers, executives, programmers, designers, computer engineers, entrepreneurs, tool builders, digital practitioners, researchers, academicians, and students at the graduate level.
Whilst Information Systems has the potential to widen our view of the world, it often has the opposite effect by limiting our ability to interact, facilitating managerial and state surveillance or instituting strict hierarchies and personal control. In this book, Bernd Stahl offers an alternative and critical perspective on the subject, arguing that the ongoing problems in this area could be caused by the misconceptualization of the nature and role of IS. Stahl discusses the question of how IS can be used to actually overcome oppression and promote emancipation, breaking the book into four sections. The first section covers the theory of critical research in IS, giving a central place for the subject of ethics. The second section discusses the philosophical underpinnings of this critical research. The third and largest section gives examples of the application of critical work in IS. The final section then reflects on the approach and suggests ways for further development.
Linguistic Geometry: From Search to Construction is the first book of its kind. Linguistic Geometry (LG) is an approach to the construction of mathematical models for large-scale multi-agent systems. A number of such systems, including air/space combat, robotic manufacturing, software re-engineering and Internet cyberwar, can be modeled as abstract board games. These are games with moves that can be represented by the movement of abstract pieces over locations on an abstract board. The purpose of LG is to provide strategies to guide the games' participants to their goals. Traditionally, discovering such strategies required searches in giant game trees. These searches are often beyond the capacity of modern and even conceivable future computers. LG dramatically reduces the size of the search trees, making the problems computationally tractable. LG provides a formalization and abstraction of search heuristics used by advanced experts including chess grandmasters. Essentially, these heuristics replace search with the construction of strategies. To formalize the heuristics, LG employs the theory of formal languages (i.e. formal linguistics), as well as certain geometric structures over an abstract board. The new formal strategies solve problems from different domains far beyond the areas envisioned by the experts. For a number of these domains, Linguistic Geometry yields optimal solutions.
This book investigates the characteristics of simple versus complex systems, and what the properties of a cyber-physical system design are that contribute to an effective implementation and make the system understandable, simple to use, and easy to maintain. The targeted audience is engineers, managers and advanced students who are involved in the design of cyber-physical systems and are willing to spend some time outside the silo of their daily work in order to widen their background and appreciation for the pervasive problems of system complexity. In the past, design of a process-control system (now called cyber-physical systems) was more of an art than an engineering endeavor. The software technology of that time was concerned primarily with functional correctness and did not pay much attention to the temporal dimension of program execution, which is as important as functional correctness when a physical process must be controlled. In the ensuing years, many problems in the design of cyber-physical systems were simplified. But with an increase in the functional requirements and system size, the complexity problems have appeared again in a different disguise. A sound understanding of the complexity problem requires some insight in cognition, human problem solving, psychology, and parts of philosophy. This book presents the essence of the author's thinking about complexity, accumulated over the past forty years.
ED-L2L, Learning to Live in the Knowledge Society, is one of the co-located conferences of the 20th World Computer Congress (WCC2008). The event is organized under the auspices of IFIP (International Federation for Information Processing) and is to be held in Milan from 7th to 10th September 2008. ED-L2L is devoted to themes related to ICT for education in the knowledge society. It provides an international forum for professionals from all continents to discuss research and practice in ICT and education. The event brings together educators, researchers, policy makers, curriculum designers, teacher educators, members of academia, teachers and content producers. ED-L2L is organised by the IFIP Technical Committee 3, Education, with the support of the Institute for Educational Technology, part of the National Research Council of Italy. The Institute is devoted to the study of educational innovation brought about through the use of ICT. Submissions to ED-L2L are published in this conference book. The published papers are devoted to the conference themes:
- Developing digital literacy for the knowledge society: information problem solving, creating, capturing and transferring knowledge, commitment to lifelong learning
- Teaching and learning in the knowledge society, playful and fun learning at home and in the school
- New models, processes and systems for formal and informal learning environments and organisations
- Developing a collective intelligence, learning together and sharing knowledge
- ICT issues in education - ethics, equality, inclusion and parental role
- Educating ICT professionals for the global knowledge society
- Managing the transition to the knowledge society
Synthesis of Finite State Machines: Functional Optimization is one of two monographs devoted to the synthesis of Finite State Machines (FSMs). This volume addresses functional optimization, whereas the second addresses logic optimization. By functional optimization here we mean the body of techniques that compute all permissible sequential functions for a given topology of interconnected FSMs, and select a 'best' sequential function out of the permissible ones. The result is a symbolic description of the FSM representing the chosen sequential function. By logic optimization here we mean the steps that convert a symbolic description of an FSM into a hardware implementation, with the goal of optimizing objectives like area, testability, performance and so on. Synthesis of Finite State Machines: Functional Optimization is divided into three parts. The first part presents some preliminary definitions, theories and techniques related to the exploration of behaviors of FSMs. The second part presents an implicit algorithm for exact state minimization of incompletely specified finite state machines (ISFSMs), and an exhaustive presentation of explicit and implicit algorithms for the binate covering problem. The third part addresses the computation of permissible behaviors at a node of a network of FSMs and the related minimization problems of non-deterministic finite state machines (NDFSMs). Key themes running through the book are the exploration of behaviors contained in a non-deterministic FSM (NDFSM), and the representation of combinatorial problems arising in FSM synthesis by means of Binary Decision Diagrams (BDDs). Synthesis of Finite State Machines: Functional Optimization will be of interest to researchers and designers in logic synthesis, CAD and design automation.
'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' Eric T. Bell
'And I, ..., had I known how to come back, I would never have gone there.' Jules Verne
'The series is divergent; therefore we may be able to do something with it.' O. Heaviside
Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and non-linearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to Bell's quote above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series.
This book presents the proceedings of the 1st International Symposium on Intelligent and Distributed Computing, IDC 2007, held in Craiova, Romania, October 2007. Coverage includes: autonomous and adaptive computing; data mining and knowledge discovery; distributed problem solving and decision making; e-business, e-health and e-learning; genetic algorithms; image processing; information retrieval; intelligence in mobile and ubiquitous computing.