Document Processing and Retrieval: TEXPROS describes the design and implementation of TEXPROS (a TEXt PROcessing System), a personal, customizable, intelligent office information and document processing system for text-oriented documents. The system supports the storage, classification, categorization, retrieval and reproduction of documents, as well as the extraction, browsing, retrieval and synthesis of information from a variety of documents. Using TEXPROS in a multi-user or distributed environment requires specific protocols for extracting, storing, transmitting and exchanging information. The authors implemented TEXPROS with a variety of techniques, including object-oriented programming, Tcl/Tk and the X Window System. The system can serve many purposes in many different applications, such as digital libraries, software documentation and information delivery. Audience: Provides in-depth, state-of-the-art coverage of information processing, retrieval and documentation for professionals such as database specialists, information systems and software developers, and information providers.
The built environment has been digitizing rapidly and is now transforming into a physical world that is at all times supplemented by a fully web-supported and interconnected digital version, often referred to as a Digital Twin. This book shows how diverse data models and web technologies can be created and used for the built environment. A key feature of this book is its technical depth and detail. The first part highlights a large diversity of IT techniques and their use in the AEC domain, from JSON to XML to EXPRESS to RDF/OWL, for modelling geometry, products, properties, and sensor and energy data. The second part focuses on diverse software solutions and approaches, including digital twins, federated data storage on the web, IoT, cloud computing, and smart cities. Key research and strategic development opportunities are comprehensively discussed for distributed web-based building data management, IoT integration and cloud computing. This book aims to serve as a guide and reference for experts and professionals in AEC computing and digital construction, including Master's students, PhD researchers, and junior to senior IT-oriented AEC professionals.
This book covers Scilab and its Scicos block-diagram graphical editor, with a special emphasis on modeling and simulation tools. The first part is a detailed Scilab tutorial; the second is dedicated to modeling and simulation of dynamical systems in Scicos. The concepts are illustrated through numerous examples, and all code used in the book is available to the reader.
ED-L2L, Learning to Live in the Knowledge Society, is one of the co-located conferences of the 20th World Computer Congress (WCC2008). The event is organized under the auspices of IFIP (International Federation for Information Processing) and is to be held in Milan from 7th to 10th September 2008. ED-L2L is devoted to themes related to ICT for education in the knowledge society. It provides an international forum for professionals from all continents to discuss research and practice in ICT and education. The event brings together educators, researchers, policy makers, curriculum designers, teacher educators, members of academia, teachers and content producers. ED-L2L is organised by IFIP Technical Committee 3, Education, with the support of the Institute for Educational Technology, part of the National Research Council of Italy. The Institute is devoted to the study of educational innovation brought about through the use of ICT. Submissions to ED-L2L are published in this conference book. The published papers address the conference themes:
* Developing digital literacy for the knowledge society: information problem solving; creating, capturing and transferring knowledge; commitment to lifelong learning
* Teaching and learning in the knowledge society; playful and fun learning at home and in the school
* New models, processes and systems for formal and informal learning environments and organisations
* Developing a collective intelligence; learning together and sharing knowledge
* ICT issues in education: ethics, equality, inclusion and parental role
* Educating ICT professionals for the global knowledge society
* Managing the transition to the knowledge society
Linguistic Geometry: From Search to Construction is the first book of its kind. Linguistic Geometry (LG) is an approach to the construction of mathematical models for large-scale multi-agent systems. A number of such systems, including air/space combat, robotic manufacturing, software re-engineering and Internet cyberwar, can be modeled as abstract board games. These are games with moves that can be represented by the movement of abstract pieces over locations on an abstract board. The purpose of LG is to provide strategies to guide the games' participants to their goals. Traditionally, discovering such strategies required searches in giant game trees. These searches are often beyond the capacity of modern and even conceivable future computers. LG dramatically reduces the size of the search trees, making the problems computationally tractable. LG provides a formalization and abstraction of search heuristics used by advanced experts including chess grandmasters. Essentially, these heuristics replace search with the construction of strategies. To formalize the heuristics, LG employs the theory of formal languages (i.e. formal linguistics), as well as certain geometric structures over an abstract board. The new formal strategies solve problems from different domains far beyond the areas envisioned by the experts. For a number of these domains, Linguistic Geometry yields optimal solutions.
The five digital forces (mobility and pervasive computing, cloud, big data, artificial intelligence and robotics, and social media) are poised to bring great academic and industrial breakthroughs. All stakeholders want to understand how to best harness these forces to their advantage. While literature exists for understanding each force independently, there is a lack of knowledge on how to utilize all the forces together to realize future enterprises. Advanced Digital Architectures for Model-Driven Adaptive Enterprises is an essential reference source that explores the potential in unifying the five digital forces to achieve increased levels of agility, efficiency, and scale. Featuring coverage on a wide range of topics including socio-technical systems, adaptive architectures, and enterprise modeling, this book is ideally designed for managers, executives, programmers, designers, computer engineers, entrepreneurs, tool builders, digital practitioners, researchers, academicians, and students at the graduate level.
Synthesis of Finite State Machines: Functional Optimization is one of two monographs devoted to the synthesis of Finite State Machines (FSMs). This volume addresses functional optimization, whereas the second addresses logic optimization. By functional optimization here we mean the body of techniques that compute all permissible sequential functions for a given topology of interconnected FSMs, and select a 'best' sequential function out of the permissible ones. The result is a symbolic description of the FSM representing the chosen sequential function. By logic optimization here we mean the steps that convert a symbolic description of an FSM into a hardware implementation, with the goal of optimizing objectives like area, testability, performance and so on. Synthesis of Finite State Machines: Functional Optimization is divided into three parts. The first part presents some preliminary definitions, theories and techniques related to the exploration of behaviors of FSMs. The second part presents an implicit algorithm for exact state minimization of incompletely specified finite state machines (ISFSMs), and an exhaustive presentation of explicit and implicit algorithms for the binate covering problem. The third part addresses the computation of permissible behaviors at a node of a network of FSMs and the related minimization problems of non-deterministic finite state machines (NDFSMs). Key themes running through the book are the exploration of behaviors contained in a non-deterministic FSM (NDFSM), and the representation of combinatorial problems arising in FSM synthesis by means of Binary Decision Diagrams (BDDs). Synthesis of Finite State Machines: Functional Optimization will be of interest to researchers and designers in logic synthesis, CAD and design automation.
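For readers new to the area, the flavour of state minimization can be sketched with the classic partition-refinement algorithm for completely specified machines. This is a didactic illustration only, far simpler than the implicit BDD-based algorithms for ISFSMs that the book presents; the machine below is an invented example.

```python
# Illustrative sketch (not the book's algorithms): partition-refinement
# state minimization for a completely specified Moore-style FSM.
# States, inputs and outputs below are hypothetical example data.

def minimize(states, inputs, delta, output):
    """Group equivalent states: same output and transitions into same blocks."""
    # Initial partition: group states by their output symbol.
    blocks = {}
    for s in states:
        blocks.setdefault(output[s], []).append(s)
    partition = list(blocks.values())
    while True:
        # Map each state to the index of the block it currently belongs to.
        index = {s: i for i, blk in enumerate(partition) for s in blk}
        refined = []
        for blk in partition:
            # Split a block by the "signature" of successor-block indices.
            groups = {}
            for s in blk:
                sig = tuple(index[delta[s, a]] for a in inputs)
                groups.setdefault(sig, []).append(s)
            refined.extend(groups.values())
        if len(refined) == len(partition):   # no block was split: done
            return partition
        partition = refined

# Example machine: states 'b' and 'c' behave identically and get merged.
states = ['a', 'b', 'c']
delta = {('a', 0): 'b', ('a', 1): 'c',
         ('b', 0): 'a', ('b', 1): 'a',
         ('c', 0): 'a', ('c', 1): 'a'}
output = {'a': 0, 'b': 1, 'c': 1}
print(sorted(sorted(blk) for blk in minimize(states, [0, 1], delta, output)))
# → [['a'], ['b', 'c']]
```

The ISFSM case treated in the book is harder precisely because "don't care" transitions make state compatibility non-transitive, which is why exact minimization there reduces to a covering problem rather than a simple refinement.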
'Et moi, ..., si j'avait su comment en revenir, je n'y serais point allé.' Jules Verne

'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' Eric T. Bell

'The series is divergent; therefore we may be able to do something with it.' O. Heaviside

Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and non-linearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the Eric T. Bell quote above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series.
The book Computer Applications in Engineering and Management is about computer applications in management, electrical engineering, electronics engineering, and civil engineering. It covers the software tools for office automation, introduces the basic concepts of database management, and provides an overview of data communication, the internet, and e-commerce. Additionally, the book explains the principles of computing management used in the construction of buildings in civil engineering and the role of computers in power grid automation in electronics engineering. Features:
* Provides insight into prospective research and application areas related to industry and technology
* Includes industry-based inputs
* Provides a hands-on approach for readers to practice and assimilate learning
This book is primarily aimed at undergraduates and graduates in computer science, information technology, civil engineering, electronics and electrical engineering, and management, as well as academicians and research scholars.
This book presents the proceedings of the 1st International Symposium on Intelligent and Distributed Computing, IDC 2007, held in Craiova, Romania, October 2007. Coverage includes: autonomous and adaptive computing; data mining and knowledge discovery; distributed problem solving and decision making; e-business, e-health and e-learning; genetic algorithms; image processing; information retrieval; intelligence in mobile and ubiquitous computing.
This book investigates the characteristics of simple versus complex systems, and what the properties of a cyber-physical system design are that contribute to an effective implementation and make the system understandable, simple to use, and easy to maintain. The targeted audience is engineers, managers and advanced students who are involved in the design of cyber-physical systems and are willing to spend some time outside the silo of their daily work in order to widen their background and appreciation for the pervasive problems of system complexity. In the past, design of a process-control system (now called cyber-physical systems) was more of an art than an engineering endeavor. The software technology of that time was concerned primarily with functional correctness and did not pay much attention to the temporal dimension of program execution, which is as important as functional correctness when a physical process must be controlled. In the ensuing years, many problems in the design of cyber-physical systems were simplified. But with an increase in the functional requirements and system size, the complexity problems have appeared again in a different disguise. A sound understanding of the complexity problem requires some insight in cognition, human problem solving, psychology, and parts of philosophy. This book presents the essence of the author's thinking about complexity, accumulated over the past forty years.
This book presents the results of the EpiAim study, exploring and describing the current situation and trends in the use of health informatics and telematics in Africa and Latin America, two regions that, despite their peculiarities and complexity, are witnessing sustained interest in these new technologies. In fact, rapid changes currently taking place are "putting these countries on the map" of the global information society. The book should help achieve a better understanding of the opportunities in health informatics for the advancement of technical co-operation between Europe and developing regions, and a view of future potential business opportunities in emerging markets.
Geotechnical works involve complex geo-engineering issues, which are reviewed in this volume presenting the very latest research and practice in computational mechanics in geotechnical engineering. Application of Computational Mechanics in Geotechnical Engineering V contains contributions on soil and rock excavations, underground structures and ground reinforcement; and on the construction of dams, embankments and rail track. Other papers consider the geomechanics of oil exploration and rock mechanics in mining; while environmental contributions include groundwater management. A wide range of methodologies are discussed: inverse methods, artificial intelligence and computational systems, which highlight future trends in the area of computational mechanics applied to geotechnical problems. The book will be of interest to researchers, academics, students, software developers, and practising engineers across the field of geotechnics.
30 tutorials and more than 100 exercises in chemoinformatics, supported by online software and data sets Chemoinformatics is widely used in both academic and industrial chemical and biochemical research worldwide. Yet, until this unique guide, there were no books offering practical exercises in chemoinformatics methods. Tutorials in Chemoinformatics contains more than 100 exercises in 30 tutorials exploring key topics and methods in the field. It takes an applied approach to the subject with a strong emphasis on problem-solving and computational methodologies. Each tutorial is self-contained and contains exercises for students to work through using a variety of software packages. The majority of the tutorials are divided into three sections devoted to theoretical background, algorithm description and software applications, respectively, with the latter section providing step-by-step software instructions. Throughout, three types of software tools are used: in-house programs developed by the authors, open-source programs and commercial programs which are available for free or at a modest cost to academics. The in-house software and data sets are available on a dedicated companion website. 
Key topics and methods covered in Tutorials in Chemoinformatics include:
* Data curation and standardization
* Development and use of chemical databases
* Structure encoding by molecular descriptors, text strings and binary fingerprints
* The design of diverse and focused libraries
* Chemical data analysis and visualization
* Structure-property/activity modeling (QSAR/QSPR)
* Ensemble modeling approaches, including bagging, boosting, stacking and random subspaces
* 3D pharmacophore modeling and pharmacological profiling using shape analysis
* Protein-ligand docking
* Implementation of algorithms in a high-level programming language
Tutorials in Chemoinformatics is an ideal supplementary text for advanced undergraduate and graduate courses in chemoinformatics, bioinformatics, computational chemistry, computational biology, medicinal chemistry and biochemistry. It is also a valuable working resource for medicinal chemists, academic researchers and industrial chemists looking to enhance their chemoinformatics skills.
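As a taste of the fingerprint-based methods listed above, here is a minimal sketch of comparing binary molecular fingerprints with the Tanimoto coefficient, one of the standard similarity measures in chemoinformatics. The bit positions and molecule names are invented for illustration, not drawn from the book's data sets, and Python is used only for self-containment; the book's tutorials rely on their own software packages.

```python
# Sketch: Tanimoto similarity between binary fingerprints, each represented
# as the set of its "on" bit positions. All bit values here are invented.

def tanimoto(fp_a, fp_b):
    """|A & B| / |A | B| for two fingerprints given as sets of set bits."""
    if not fp_a and not fp_b:
        return 1.0  # convention: two empty fingerprints count as identical
    return len(fp_a & fp_b) / len(fp_a | fp_b)

molecule_x = {3, 17, 42, 101, 255}   # hypothetical bit positions
analogue   = {3, 17, 42, 101, 300}   # differs in one bit
unrelated  = {7, 99, 512}

print(tanimoto(molecule_x, analogue))   # 4 shared bits / 6 total bits
print(tanimoto(molecule_x, unrelated))  # no shared bits → 0.0
```

In practice a similarity threshold (often around 0.7 to 0.85 for such fingerprints) is used to decide whether two structures count as "similar" in library design and virtual screening.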
Applied Linear Regression for Business Analytics with R introduces regression analysis to business students using the R programming language with a focus on illustrating and solving real-time, topical problems. Specifically, this book presents modern and relevant case studies from the business world, along with clear and concise explanations of the theory, intuition, hands-on examples, and the coding required to employ regression modeling. Each chapter includes the mathematical formulation and details of regression analysis and provides in-depth practical analysis using the R programming language.
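The book's examples are written in R; purely as a language-neutral illustration of the computation underlying simple linear regression, the following sketch fits a line by the closed-form least-squares estimates. The data are invented, not taken from the book's case studies.

```python
# Hedged illustration of ordinary least squares for y = b0 + b1*x, using
# b1 = cov(x, y) / var(x) and b0 = mean(y) - b1*mean(x). The "ad spend vs.
# units sold" numbers below are made-up example data.

def ols_simple(x, y):
    """Return intercept b0 and slope b1 of the least-squares line."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))  # ~ cov(x, y)
    sxx = sum((xi - mx) ** 2 for xi in x)                     # ~ var(x)
    b1 = sxy / sxx
    b0 = my - b1 * mx
    return b0, b1

# Hypothetical data: advertising spend (thousands) vs. units sold.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.1, 8.0, 9.9]   # roughly y = 0.1 + 2x with noise
b0, b1 = ols_simple(x, y)
print(round(b0, 3), round(b1, 3))  # → 0.09 1.97
```

The equivalent one-liner in the book's setting would be R's `lm(y ~ x)`, which additionally reports the standard errors and diagnostics the text discusses.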
Electronic services networks--systems of terminals and computers linked by telecommunication apparatus and used to process transactions--have had an increasing influence on industrial structures and commercial practices over the past decade. Margaret Guerin-Calvert and Steven Wildman have assembled diverse essays representing the best of current thinking on these networks. The book provides the reader with varied theoretical perspectives on ESNs and their effects on business and finance and contains five case studies that apply these theoretical ideas to issues raised by the proliferation of these networks. Unlike other works, which have focused on ESNs as features of specific industries, this collection explores the networks themselves as economic phenomena. The contributions are grouped into two parts. The first presents general theoretical perspectives on the economics of various ESNs, their effects on the industries and markets that employ them, and the policy issues they raise. Among the topics discussed are structural relationships among ESNs, their effect on organizational structures, compatibility between shared networks, and competitive search facilitation. In Part II, the contributors offer a detailed look at the economic policy histories of ESNs in specific industries, including banking, real estate, airlines, and travel. There are discussions of automatic teller machines, computer reservation systems, multiple-listing services, and electronic data interchange. These studies demonstrate the incredible variety of applications of ESN technology and make this an indispensable resource for professionals in all types of businesses that use or could use ESNs, as well as for students in a wide range of law, business, and public policy courses.
This volume is about "structure." The search for "structure," always the pursuit of sciences within their specific areas and perspectives, is witnessing these days a dramatic revolution. The coexistence and interaction of so many structures (atoms, humans, cosmos and all that there is in between) would be inconceivable, according to many experts, if there were not, behind it all, some general organizational principles that (at least in some asymptotic way) make possible so many equilibria among species and natural objects, fantastically tuned to an extremely high degree of precision. The evidence accumulates to an increasingly impressive degree; a concrete example comes from physics, whose constant aim always was and is that of searching for "ultimate laws," out of which everything should follow, from quarks to the cosmos. Our notions and philosophy have undergone major revolutions, whenever the "unthinkable" has been changed by its wonderful endeavours into "fact." Well, it is just from physics that evidence comes: even if the "ultimate" could be reached, it would not in any way be a terminal point. When "complexity" comes into the game, entirely new notions have to be invented; they all have to do with "structure," though this time in a much wider sense than would have been understood a decade or so ago.
Environmental Informatics is a fast-growing field that draws on methods from computer science, environmental planning, ecology and related subjects. As well as being an interdisciplinary area, Environmental Informatics provides an interface between all the professional groups involved. Monitoring the state of the environment, analysing existing data, presenting the data to scientists and the public, and providing decision support are only some of the topics involved. Environmental Informatics is therefore a good foundation for the computer-assisted protection of the environment.
Distribution and interoperability in heterogeneous computing environments are the key requirements for state-of-the-art information processing systems. Distributed applications are making a critical contribution in many application sectors, such as office automation, finance, manufacturing, telecommunications, aerospace, and transportation. Users demand support for the construction, integration and management of their application systems as well as for the interoperability of independent application components. DAIS '97 provides a forum for researchers, application designers and users to review, discuss and learn about new approaches and concepts in the fields of distributed applications. DAIS '97 will especially focus on the interoperability between different applications and services, different implementations of the same and of different distributed platforms.
Welcome to IWQOS'97 in New York City! Over the past several years, there has been a considerable amount of research within the field of Quality of Service (QOS). Much of that work has taken place within the context of QOS support for distributed multimedia systems, operating systems, transport subsystems, networks, devices and formal languages. The objective of the Fifth International Workshop on Quality of Service (IWQOS) is to bring together researchers, developers and practitioners working in all facets of QOS research. While many workshops and conferences offer technical sessions on the topic of QOS, none other than IWQOS provides a single-track workshop dedicated to QOS research. The theme of IWQOS'97 is building QOS into distributed systems. Implicit in that theme is the notion that the QOS community should now focus on discussing results from actual implementations of their work. As QOS research moves from theory to practice, we are interested in gauging the impact of ideas discussed at previous workshops on the development of actual systems. While we are interested in experimental results, IWQOS remains a forum for fresh and innovative ideas emerging in the field. As a result, authors were solicited to provide experimental research (long) papers and more speculative position (short) statements for consideration. We think we have a great invited and technical program lined up for you this year. The program reflects the Program Committee's desire to hear about experimental results, controversial QOS subjects and retrospectives on where we are and where we are going.
The idea that games can have positive impacts upon critical thinking and problem solving is widely accepted in today's digital society, yet the effect of video games on human cognition is still largely unexplored. Gaming and Cognition: Theories And Practice From The Learning Sciences applies the principles of research in the study of human cognition to video games, providing a critical examination of the rigor and design of the experiments in the study of cognition and gaming. Combining many aspects of the learning sciences such as psychology, instructional design, and education into one coherent whole, this book presents historical, theoretical, and practical perspectives.
Finite model theory, as understood here, is an area of mathematical logic that has developed in close connection with applications to computer science, in particular the theory of computational complexity and database theory. One of the fundamental insights of mathematical logic is that our understanding of mathematical phenomena is enriched by elevating the languages we use to describe mathematical structures to objects of explicit study. If mathematics is the science of patterns, then the media through which we discern patterns, as well as the structures in which we discern them, command our attention. It is this aspect of logic which is most prominent in model theory, "the branch of mathematical logic which deals with the relation between a formal language and its interpretations". No wonder, then, that mathematical logic, and finite model theory in particular, should find manifold applications in computer science: from specifying programs to querying databases, computer science is rife with phenomena whose understanding requires close attention to the interaction between language and structure. This volume gives a broad overview of some central themes of finite model theory: expressive power, descriptive complexity, and zero-one laws, together with selected applications to database theory and artificial intelligence, especially constraint databases and constraint satisfaction problems. The final chapter provides a concise modern introduction to modal logic, which emphasizes the continuity in spirit and technique with finite model theory.
This volume contains the proceedings of IFIPTM 2008, the Joint iTrust and PST Conferences on Privacy, Trust Management and Security, held in Trondheim, Norway from June 18 to June 20, 2008. IFIPTM 2008 provides a truly global platform for the reporting of research, development, policy and practice in the interdependent areas of Privacy, Security, and Trust. Following the traditions inherited from the highly successful iTrust and PST conference series, IFIPTM 2008 focuses on trust, privacy and security from multidisciplinary perspectives. The conference is an arena for discussion about relevant problems from both research and practice in the areas of academia, business, and government. IFIPTM 2008 is an open IFIP conference, which only accepts contributed papers, so all papers in these proceedings have passed strict peer review. The program of the conference features both theoretical research papers and reports of real-world case studies. IFIPTM 2008 received 62 submissions. The program committee selected 22 papers for presentation and inclusion in the proceedings. In addition, the program and the proceedings include 3 demo descriptions. The highlights of IFIPTM 2008 include invited talks and tutorials by industrial and academic experts in the fields of trust management, privacy and security, including Jon Bing and Michael Steiner.
The present textbook contains the records of a two-semester course on queueing theory, including an introduction to matrix-analytic methods. This course comprises four hours of lectures and two hours of exercises per week and has been taught at the University of Trier, Germany, for about ten years in sequence. The course is directed to last-year undergraduate and first-year graduate students of applied probability and computer science, who have already completed an introduction to probability theory. Its purpose is to present material that is close enough to concrete queueing models and their applications, while providing a sound mathematical foundation for their analysis. Thus the goal of the present book is two-fold. On the one hand, students who are mainly interested in applications easily feel bored by elaborate mathematical questions in the theory of stochastic processes. The presentation of the mathematical foundations in our courses is chosen to cover only the necessary results, which are needed for a solid foundation of the methods of queueing analysis. Further, students oriented towards applications expect to have a justification for their mathematical efforts in terms of immediate use in queueing analysis. This is the main reason why we have decided to introduce new mathematical concepts only when they will be used in the immediate sequel. On the other hand, students of applied probability do not want any heuristic derivations just for the sake of yielding fast results for the model at hand.
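To illustrate the kind of model such a course treats, here are the standard steady-state formulas for the M/M/1 queue, sketched in Python. These are textbook closed-form results, not the book's matrix-analytic derivations, and the rates below are arbitrary example values.

```python
# Steady-state metrics of an M/M/1 queue with Poisson arrivals at rate lam
# and exponential service at rate mu; stability requires lam < mu.

def mm1_metrics(lam, mu):
    """Return (utilization rho, mean number in system L, mean sojourn time W)."""
    if lam >= mu:
        raise ValueError("queue is unstable unless lam < mu")
    rho = lam / mu              # fraction of time the server is busy
    L = rho / (1 - rho)         # mean number of customers in the system
    W = 1 / (mu - lam)          # mean time in system; Little's law: L = lam * W
    return rho, L, W

# Example: 2 arrivals/min served at 4/min on average.
rho, L, W = mm1_metrics(lam=2.0, mu=4.0)
print(rho, L, W)  # → 0.5 1.0 0.5
```

Matrix-analytic methods, the subject of the book's later chapters, generalize exactly this kind of computation to queues whose arrival and service processes have internal Markovian structure.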
Your secret weapon to understanding--and using!--one of the most powerful influences in the world today From your Facebook News Feed to your most recent insurance premiums--even making toast!--algorithms play a role in virtually everything that happens in modern society and in your personal life. And while they can seem complicated from a distance, the reality is that, with a little help, anyone can understand--and even use--these powerful problem-solving tools! In Algorithms For Dummies, you'll discover the basics of algorithms, including what they are, how they work, where you can find them (spoiler alert: everywhere!), who invented the most important ones in use today (a Greek philosopher is involved), and how to create them yourself. You'll also find: Dozens of graphs and charts that help you understand the inner workings of algorithms Links to an online repository called GitHub for constant access to updated code Step-by-step instructions on how to use Google Colaboratory, a zero-setup coding environment that runs right from your browser Whether you're a curious internet user wondering how Google seems to always know the right answer to your question or a beginning computer science student looking for a head start on your next class, Algorithms For Dummies is the can't-miss resource you've been waiting for. |
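The "Greek philosopher" teaser presumably alludes to Euclid and his method for the greatest common divisor, one of the oldest algorithms still in everyday use. A minimal sketch (a generic illustration, not code from the book):

```python
# Euclid's algorithm: repeatedly replace the pair (a, b) with (b, a mod b)
# until the remainder is zero; the surviving value is the GCD.

def gcd(a, b):
    """Greatest common divisor by Euclid's method of repeated remainders."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # → 21
```

The same idea appears in Python's standard library as `math.gcd`, so in practice you would call that rather than roll your own.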
You may like...
* Ties that bind - Race and the politics… by Shannon Walsh, Jon Soske (Paperback)
* Fuzzy Discrete Structures by Davender S. Malik, John N. Mordeson (Hardcover) R3,189
* Worry Monsters - A Child's Guide to… by Summersdale Publishers Ltd (Paperback) R183
* Fuzzy Mathematical Programming - Methods… by Young-Jou Lai, Ching-Lai Hwang (Paperback) R1,620
* Mathematical Principles of Fuzzy Logic by Vil'em Novak, Irina Perfilieva, … (Hardcover) R6,071