The building blocks of today's and future embedded systems are complex intellectual property components, or cores, many of which are programmable processors. Traditionally, these embedded processors have mostly been programmed in assembly languages for efficiency reasons. This implies time-consuming programming, extensive debugging, and low code portability. The requirements of short time-to-market and dependability of embedded systems are obviously much better met by using high-level language (e.g. C) compilers instead of assembly. However, the use of C compilers frequently incurs a code quality overhead compared to manually written assembly programs. Due to the need for efficient embedded systems, this overhead must be very low in order to make compilers useful in practice. In turn, this requires new compiler techniques that take the specific constraints of embedded system design into account. An example is the specialized architectures of recent DSP and multimedia processors, which are not yet sufficiently exploited by existing compilers.
There is tremendous interest in the design and application of agents in virtually every area, including avionics, business, the Internet, engineering, health sciences, and management. There is no single agreed definition of an agent, but we can define an agent as a computer program that autonomously or semi-autonomously acts on behalf of the user. In the last five years, the transition of intelligent systems research in general, and agent-based research in particular, from the laboratory into the real world has resulted in the emergence of several phenomena. These trends can be placed in three categories, namely humanization, architectures, and learning and adaptation. These phenomena are distinct from the traditional logic-centered approach associated with the agent paradigm. Humanization of agents can be understood, among other aspects, in terms of the semantic quality of agent design. The need to humanize agents is to allow practitioners and users to make more effective use of this technology. Further, context-awareness is another aspect that has assumed importance in the light of ubiquitous computing and ambient intelligence. The widespread and varied use of agents has, on the other hand, created a need for agent-based software development frameworks and design patterns, as well as architectures for situated interaction, negotiation, e-commerce, e-business, and information retrieval. Finally, traditional agent designs did not incorporate human-like abilities of learning and adaptation.
Computer architecture presently faces an unprecedented revolution: the step from monolithic processors towards multi-core ICs, motivated by the ever-increasing need for power and energy efficiency in nanoelectronics. Whether you prefer to call it MPSoC (multi-processor system-on-chip) or CMP (chip multiprocessor), no doubt this revolution affects large domains of both computer science and electronics, and it poses many new interdisciplinary challenges. For instance, efficient programming models and tools for MPSoC are largely an open issue: "Multi-core platforms are a reality - but where is the software support?" (R. Lauwereins, IMEC). Solving it will require enormous research efforts as well as the education of a whole new breed of software engineers who bring the results from universities into industrial practice. At the same time, the design of complex MPSoC architectures is an extremely time-consuming task, particularly in the wireless and multimedia application domains, where heterogeneous architectures are predominant. Due to the exploding NRE and mask costs, most companies are now following a platform approach: invest a large (but one-time) design effort into a proper core architecture, and create easy-to-design derivatives for new standards or product features. Needless to say, only the most efficient MPSoC platforms have a real chance to enjoy a multi-year lifetime on the highly competitive semiconductor market for embedded systems.
This volume contains the extended papers selected for presentation at the ninth edition of the International Symposium on Web & Wireless Geographical Information Systems (W2GIS 2009), hosted by the National Centre for Geocomputation at NUI Maynooth (Ireland). W2GIS 2009 was the ninth in a series of successful events beginning with Kyoto 2001, alternating locations between East Asia and Europe. We invited submissions that provided an up-to-date review of advances in theoretical, technical, and practical issues of W2GIS and intelligent geomedia. Reports on ongoing implementations and real-world applications research were particularly welcome at this symposium. Now in its ninth year, the scope of W2GIS has expanded to include continuing advances in wireless and Internet technologies that generate ever-increasing interest in the diffusion, usage, and processing of geo-referenced data of all types - geomedia. Spatially aware wireless and Internet devices offer new ways of accessing and analyzing geo-spatial information in both real-world and virtual spaces. Consequently, new challenges and opportunities are provided that expand the traditional GIS research scope into the realm of intelligent media - including geomedia with context-aware behaviors for self-adaptive use and delivery. Our common aim is research-based innovation that increases the ease of creating, delivering, and using geomedia across different platforms and application domains that continue to have a dramatic effect on today's society.
This book constitutes the refereed proceedings of the 4th International Workshop on Self-Organizing Systems, IWSOS 2009, held in Zurich, Switzerland, in December 2009. The 14 revised full papers and 13 revised short papers presented were carefully selected from the 34 full and 27 short paper submissions. The papers are organized in topical sections on ad hoc and sensor networks; services, storage, and internet routing; peer-to-peer systems; theory and general approaches; overlay networks; peer-to-peer systems and internet routing; wireless networks; and network topics.
This book constitutes the joint refereed proceedings of the Third International Workshop on Communication Technologies for Vehicles, Nets4Cars 2011, and the First International Workshop on Communication Technologies for Vehicles in the Railway Transportation, Nets4Trains 2011, held in Oberpfaffenhofen, Germany, in March 2011. The 7 full papers of the rail track and 12 full papers of the road track, presented together with a keynote, were carefully reviewed and selected from 13 and 21 submissions respectively. They provide an overview of the latest technologies and research in the field of intra- and inter-vehicle communication and present original research results in areas relating to communication protocols and standards, mobility and traffic models, experimental and field operational testing, and performance analysis.
As future generation information technology (FGIT) becomes specialized and fragmented, it is easy to lose sight of the fact that many topics in FGIT have common threads and, because of this, advances in one discipline may be transmitted to others. Presentation of recent results obtained in different disciplines encourages this interchange for the advancement of FGIT as a whole. Of particular interest are hybrid solutions that combine ideas taken from multiple disciplines in order to achieve something more significant than the sum of the individual parts. Through such a hybrid philosophy, a new principle can be discovered, which has the propensity to propagate throughout multifaceted disciplines. FGIT 2009 was the first mega-conference that attempted to follow the above idea of hybridization in FGIT in the form of multiple events related to particular disciplines of IT, conducted by separate scientific committees but coordinated in order to expose the most important contributions. It included the following international conferences: Advanced Software Engineering and Its Applications (ASEA), Bio-Science and Bio-Technology (BSBT), Control and Automation (CA), Database Theory and Application (DTA), Disaster Recovery and Business Continuity (DRBC; published independently), Future Generation Communication and Networking (FGCN), which was combined with Advanced Communication and Networking (ACN), Grid and Distributed Computing (GDC), Multimedia, Computer Graphics and Broadcasting (MulGraB), Security Technology (SecTech), Signal Processing, Image Processing and Pattern Recognition (SIP), and u- and e-Service, Science and Technology (UNESST).
This book constitutes the proceedings of the KR4HC 2010 workshop, held at ECAI in Lisbon, Portugal, in August 2010. The 11 extended papers presented were carefully reviewed and selected from 19 submissions. The papers cover topics such as ontologies, patient data and records, and clinical practice guidelines.
As software systems become increasingly ubiquitous, issues of dependability become ever more crucial. Given that solutions to these issues must be considered from the very beginning of the design process, it is reasonable that dependability and security are addressed at the architectural level. This book has originated from an effort to bring together the research communities of software architectures, dependability and security. This state-of-the-art survey contains expanded and peer-reviewed papers based on the carefully selected contributions to two workshops: the Workshop on Architecting Dependable Systems (WADS 2008), organized at the 2008 International Conference on Dependable Systems and Networks (DSN 2008), held in Anchorage, Alaska, USA, in June 2008, and the Third International Workshop on Views On Designing Complex Architectures (VODCA 2008) held in Bertinoro, Italy, in August 2008. It also contains invited papers written by recognized experts in the area. The 13 papers are organized in topical sections on dependable service-oriented architectures, fault-tolerance and system evaluation, and architecting security.
This volume contains the proceedings of the Second International Workshop on Mobile Entity Localization and Tracking in GPS-less Environments (MELT 2009), held in Orlando, Florida on September 30, 2009 in conjunction with the 11th International Conference on Ubiquitous Computing (Ubicomp 2009). MELT provides a forum for the presentation of state-of-the-art technologies in mobile localization and tracking and novel applications of location-based services. MELT 2009 continued the success of the first workshop in the series (MELT 2008), which was held in San Francisco, California on September 19, 2008 in conjunction with Mobicom. Location-awareness is a key component for achieving context-awareness. Recent years have witnessed an increasing trend towards location-based services and applications. In most cases, however, location information is limited by the accessibility of GPS, which is unavailable for indoor or underground facilities and unreliable in urban environments. Much research has been done, in both the sensor network community and the ubiquitous computing community, to provide techniques for localization and tracking in GPS-less environments. Novel applications based on ad-hoc localization and real-time tracking of mobile entities are growing as a result of these technologies. MELT brings together leaders from both the academic and industrial research communities to discuss challenging and open problems, to evaluate the pros and cons of various approaches, to bridge the gap between theory and applications, and to envision new research opportunities.
This book is the first in a series on novel low power design architectures, methods and design practices. It results from a large European project started in 1997, whose goal is to promote the further development and the faster and wider industrial use of advanced design methods for reducing the power consumption of electronic systems. Low power design became crucial with the widespread use of portable information and communication terminals, where a small battery has to last for a long period. High performance electronics, in addition, suffers from a permanent increase of the dissipated power per square millimetre of silicon, due to increasing clock-rates, which causes cooling and reliability problems or otherwise limits the performance. The European Union's Information Technologies Programme 'Esprit' did therefore launch a 'Pilot action for Low Power Design', which eventually grew to 19 R&D projects and one coordination project, with an overall budget of 14 million Euro. It is meanwhile known as the European Low Power Initiative for Electronic System Design (ESD-LPD) and will be completed by the end of 2001. It involves 30 major European companies and 20 well-known institutes. The R&D projects aim to develop or demonstrate new design methods for power reduction, while the coordination project takes care that the methods, experiences and results are properly documented and publicised.
Model Based Fuzzy Control uses a given conventional or fuzzy open loop model of the plant under control to derive the set of fuzzy rules for the fuzzy controller. Of central interest are the stability, performance, and robustness of the resulting closed loop system. The major objective of model based fuzzy control is to use the full range of linear and nonlinear design and analysis methods to design such fuzzy controllers with better stability, performance, and robustness properties than non-fuzzy controllers designed using the same techniques. This objective has already been achieved for fuzzy sliding mode controllers and fuzzy gain schedulers - the main topics of this book. The primary aim of the book is to serve as a guide for the practitioner and to provide introductory material for courses in control theory.
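As a toy illustration of the gain-scheduling idea mentioned above (not taken from the book; the membership shape and all gain values here are made up), a fuzzy gain scheduler can blend the gains of locally tuned linear controllers according to the membership of a scheduling variable in overlapping operating regions:

```python
def membership_low(x, lo=-1.0, hi=1.0):
    # Ramp membership of x in the "low" operating region:
    # 1 at or below lo, 0 at or above hi, linear in between.
    return max(0.0, min(1.0, (hi - x) / (hi - lo)))

def scheduled_gain(x, k_low=2.0, k_high=0.5):
    # Blend two local controller gains by the fuzzy membership of the
    # scheduling variable x (e.g. the plant operating point).
    w = membership_low(x)
    return w * k_low + (1.0 - w) * k_high

def control(error, x):
    # Proportional control with a fuzzily scheduled gain.
    return scheduled_gain(x) * error
```

Between the two regions the gain interpolates smoothly from 2.0 down to 0.5, which is the basic mechanism by which a fuzzy gain scheduler covers a nonlinear operating range with locally tuned linear controllers.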
With the advent of portable and autonomous computing systems, power consumption has emerged as a focal point in many research projects, commercial systems and DoD platforms. One current research initiative that drew much attention to this area is the Power Aware Computing and Communications (PAC/C) program sponsored by DARPA. Many of the chapters in this book include results from work that has been supported by the PAC/C program. The performance of computer systems has been improving tremendously while the size and weight of such systems has been constantly shrinking. The capacities of batteries relative to their sizes and weights have also been improving, but at a rate much slower than the rate of improvement in computer performance and the rate of shrinking in computer sizes. The relation between the power consumption of a computer system and its performance and size is a complex one, very much dependent on the specific system and the technology used to build that system. We do not need a complex argument, however, to be convinced that energy and power, which is the rate of energy consumption, are becoming critical components in computer systems in general, and in portable and autonomous systems in particular. Most of the early research on power consumption in computer systems addressed the issue of minimizing power in a given platform, which usually translates into minimizing energy consumption and, thus, longer battery life.
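Since power is the rate of energy consumption, lowering the average power drawn from a fixed-capacity battery lengthens battery life proportionally. A back-of-the-envelope sketch of this relation (all capacities, power levels, and duty cycles below are hypothetical):

```python
def avg_power_w(active_w, idle_w, duty_cycle):
    # Average power of a system that is active a fraction duty_cycle
    # of the time and idle the rest of the time.
    return duty_cycle * active_w + (1.0 - duty_cycle) * idle_w

def battery_life_h(capacity_wh, avg_w):
    # Battery lifetime in hours = stored energy / average power draw.
    return capacity_wh / avg_w

# Hypothetical system: 2 W active, 0.1 W idle, 10 Wh battery.
# Cutting the duty cycle from 50% to 10% stretches the lifetime.
busy = avg_power_w(2.0, 0.1, 0.5)   # 1.05 W
quiet = avg_power_w(2.0, 0.1, 0.1)  # 0.29 W
```

With these made-up figures, the same battery lasts roughly 9.5 hours in the busy case and over 34 hours in the quiet case, which is why power-aware duty cycling dominates early work on platform-level energy minimization.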
The design of computer systems to be embedded in critical real-time applications is a complex task. Such systems must not only guarantee to meet hard real-time deadlines imposed by their physical environment, they must guarantee to do so dependably, despite both physical faults (in hardware) and design faults (in hardware or software). A fault-tolerance approach is mandatory for these guarantees to be commensurate with the safety and reliability requirements of many life- and mission-critical applications. This book explains the motivations and the results of a collaborative project whose objective was to significantly decrease the lifecycle costs of such fault-tolerant systems. The end-user companies participating in this project already deploy fault-tolerant systems in critical railway, space and nuclear-propulsion applications. However, these are proprietary systems whose architectures have been tailored to meet domain-specific requirements. This has led to very costly, inflexible, and often hardware-intensive solutions that, by the time they are developed, validated and certified for use in the field, can already be out-of-date in terms of their underlying hardware and software technology.
The aim of the FMICS workshop series is to provide a forum for researchers who are interested in the development and application of formal methods in industry. In particular, these workshops are intended to bring together scientists and practitioners who are active in the area of formal methods and interested in exchanging their experiences in the industrial usage of these methods. These workshops also strive to promote research and development for the improvement of formal methods and tools for industrial applications. The topics for which contributions to FMICS 2008 were solicited included, but were not restricted to, the following: design, specification, code generation and testing based on formal methods; verification and validation of complex, distributed, real-time systems and embedded systems; verification and validation methods that address shortcomings of existing methods with respect to their industrial applicability (e.g., scalability and usability issues); tools for the development of formal design descriptions; case studies and experience reports on industrial applications of formal methods, focusing on lessons learned or identification of new research directions; impact of the adoption of formal methods on the development process and associated costs; and application of formal methods in standardization and industrial forums. The workshop included six sessions of regular contributions in the areas of model checking, testing, software verification, real-time performance, and industrial case studies. There were also three invited presentations, given by Steven Miller, Rance Cleaveland, and Werner Damm, covering the application of formal methods in the avionics and automotive industries.
The 13th International Conference on Human-Computer Interaction, HCI International 2009, was held in San Diego, California, USA, July 19-24, 2009, jointly with the Symposium on Human Interface (Japan) 2009, the 8th International Conference on Engineering Psychology and Cognitive Ergonomics, the 5th International Conference on Universal Access in Human-Computer Interaction, the Third International Conference on Virtual and Mixed Reality, the Third International Conference on Internationalization, Design and Global Development, the Third International Conference on Online Communities and Social Computing, the 5th International Conference on Augmented Cognition, the Second International Conference on Digital Human Modeling, and the First International Conference on Human Centered Design. A total of 4,348 individuals from academia, research institutes, industry and governmental agencies from 73 countries submitted contributions, and 1,397 papers that were judged to be of high scientific quality were included in the program. These papers address the latest research and development efforts and highlight the human aspects of the design and use of computing systems. The papers accepted for presentation thoroughly cover the entire field of human-computer interaction, addressing major advances in knowledge and effective use of computers in a variety of application areas.
Embedded systems take over complex control and data processing tasks in diverse application fields such as automotive, avionics, consumer products, and telecommunications. They are the primary driver for improving overall system safety, efficiency, and comfort. The demand for further improvement in these aspects can only be satisfied by designing embedded systems of increasing complexity, which in turn necessitates the development of new system design methodologies based on specification, design, and verification languages. The objective of the book at hand is to provide researchers and designers with an overview of current research trends, results, and application experiences in computer languages for embedded systems. The book builds upon the most relevant contributions to the 2008 conference Forum on Design Languages (FDL), the premier international conference specializing in this field. These contributions have been selected based on the results of reviews provided by leading experts from research and industry. In many cases, the authors have improved their original work by adding breadth, depth, or explanation.
During the 1980s and early 1990s there was significant work in the design and implementation of hardware neurocomputers. Nevertheless, most of these efforts may be judged to have been unsuccessful: at no time have hardware neurocomputers been in wide use. This lack of success may be largely attributed to the fact that earlier work was almost entirely aimed at developing custom neurocomputers, based on ASIC technology, but for such niche areas this technology was never sufficiently developed or competitive enough to justify large-scale adoption. On the other hand, gate-arrays of the period mentioned were never large enough nor fast enough for serious artificial-neural-network (ANN) applications. But technology has now improved: the capacity and performance of current FPGAs are such that they present a much more realistic alternative. Consequently, neurocomputers based on FPGAs are now a much more practical proposition than they have been in the past. This book summarizes some work towards this goal and consists of 12 papers that were selected, after review, from a number of submissions. The book is nominally divided into three parts: Chapters 1 through 4 deal with foundational issues; Chapters 5 through 11 deal with a variety of implementations; and Chapter 12 looks at the lessons learned from a large-scale project and also reconsiders design issues in light of current and future technology.
This essential resource for professionals and advanced students in security programming and system design introduces the foundations of programming systems security and the theory behind access control models, and addresses emerging access control mechanisms.
To the hard-pressed systems designer this book will come as a godsend. It is a hands-on guide to the many ways in which processor-based systems are designed for low power. Covering a huge range of topics, and co-authored by some of the field's top practitioners, the book provides a good starting point for engineers in the area, and for research students embarking upon work on embedded systems and architectures.
The two volume set LNCS 5506 and LNCS 5507 constitutes the thoroughly refereed post-conference proceedings of the 15th International Conference on Neural Information Processing, ICONIP 2008, held in Auckland, New Zealand, in November 2008. The 260 revised full papers presented were carefully reviewed and selected from numerous ordinary paper submissions and 15 specially organized sessions. 116 papers are published in the first volume and 112 in the second volume. The contributions deal with topics in the areas of data mining methods for cybersecurity, computational models and their applications to machine learning and pattern recognition, lifelong incremental learning for intelligent systems, application of intelligent methods in ecological informatics, pattern recognition from real-world information by SVM and other sophisticated techniques, dynamics of neural networks, recent advances in brain-inspired technologies for robotics, and neural information processing in cooperative multi-robot systems.
This year's edition of the international federated conferences on Distributed Computing Techniques took place in Lisbon during June 9-11, 2009. It was hosted by the Faculty of Sciences of the University of Lisbon and formally organized by Instituto de Telecomunicações. The DisCoTec conferences jointly cover the complete spectrum of distributed computing subjects, ranging from theoretical foundations to formal specification techniques to practical considerations. The event consisted of the 11th International Conference on Coordination Models and Languages (COORDINATION), the 9th IFIP International Conference on Distributed Applications and Interoperable Systems (DAIS), and the IFIP International Conference on Formal Techniques for Distributed Systems (FMOODS/FORTE). COORDINATION focused on languages, models, and architectures for concurrent and distributed software. DAIS emphasized methods, techniques, and system infrastructures needed to design, build, operate, evaluate, and manage modern distributed applications in any kind of application environment and scenario. FMOODS (11th Formal Methods for Open Object-Based Distributed Systems) joined forces with FORTE (29th Formal Techniques for Networked and Distributed Systems), creating a forum for fundamental research on theory and applications of distributed systems.
The 7th IFIP Workshop on Software Technologies for Future Embedded and Ubiquitous Systems (SEUS) followed on the success of six previous editions in Capri, Italy (2008), Santorini, Greece (2007), Gyeongju, Korea (2006), Seattle, USA (2005), Vienna, Austria (2004), and Hakodate, Japan (2003), establishing SEUS as one of the emerging workshops in the field of embedded and ubiquitous systems. SEUS 2009 continued the tradition of fostering cross-community scientific excellence and establishing strong links between research and industry. The fields of both embedded computing and ubiquitous systems have seen considerable growth over the past few years. Given the advances in these fields, and also those in the areas of distributed computing, sensor networks, middleware, etc., the area of ubiquitous embedded computing is now being envisioned as the way of the future. The systems and technologies that will arise in support of ubiquitous embedded computing will undoubtedly need to address a variety of issues, including dependability, real-time operation, human-computer interaction, autonomy, resource constraints, etc. All of these requirements pose a challenge to the research community. The purpose of SEUS 2009 was to bring together researchers and practitioners with an interest in advancing the state of the art and the state of practice in this emerging field, with the hope of fostering new ideas, collaborations and technologies. SEUS 2009 would not have been possible without the effort of many people.
This book is the result of merging two workshop series, namely one on computerized guidelines and protocols and the other on knowledge management for healthcare procedures. The merge resulted in the KR4HC workshop: Knowledge Representation for Health Care: Data, Processes, and Guidelines. This workshop was held in conjunction with the 12th Conference on Artificial Intelligence in Medicine (AIME 2009), in Verona, Italy. The book includes, in addition to the full-length workshop papers, invited peer-reviewed advanced papers on lessons learned in these fields. The KR4HC workshop continued a line of successful guideline workshops held in 2000, 2004, 2006, 2007, and 2008. Following the success of the first European Workshop on Computerized Guidelines and Protocols held in Leipzig, Germany, in 2000, the Symposium on Computerized Guidelines and Protocols (CGP 2004) was organized in Prague, Czech Republic in 2004 to identify use cases for guideline-based applications in health care, computerized methods for supporting the guideline development process, and pressing issues and promising approaches for developing usable and maintainable vehicles for guideline delivery. In 2006 an ECAI 2006 workshop at Riva del Garda, Italy, entitled "AI Techniques in Health Care: Evidence-Based Guidelines and Protocols", was organized to bring together researchers from different branches of artificial intelligence to examine cutting-edge approaches to guideline modeling and development and to consider how different communities can cooperate to address the challenges of computer-based guideline development.
in the algorithmic and foundational aspects, high-level approaches, as well as more applied and technology-related issues regarding tools and applications of wireless sensor networks. June 2009 Jie Wu, Viktor K. Prasanna, Ivan Stojmenovic. Message from the Program Chair: This proceedings volume includes the accepted papers of the 5th International Conference on Distributed Computing in Sensor Systems. This year we introduced some changes in the composition of the three tracks to increase cross-disciplinary interactions. The Algorithms track was enhanced to include topics pertaining to performance analysis and network optimization and renamed "Algorithms and Analysis." The Systems and Applications tracks, previously separate, were combined into a single track. And a new track was introduced on "Signal Processing and Information Theory." DCOSS 2009 received 116 submissions for the three tracks. After a thorough review process, in which at least three reviews were solicited for all papers, a total of 26 papers were accepted. The research contributions in this proceedings span many aspects of sensor systems, including energy-efficient mechanisms, tracking and surveillance, activity recognition, simulation, query optimization, network coding, localization, application development, and data and code dissemination. Based on the reviews, we also identified the best paper from each track, as follows. Best paper in the Algorithms and Analysis track: "Efficient Sensor Placement for Surveillance Problems" by Pankaj Agarwal, Esther Ezra and Shashidhara Ganjugunte. Best paper in the Applications and Systems track: "Optimal Allocation of Time-Resources for Multihypothesis Activity-Level Detection" by Gautam Thatte, Viktor Rozgic, Ming Li, Sabyasachi Ghosh, Urbashi Mitra, Shri Narayanan, Murali Annavaram and Donna Spruijt-Metz. Best paper in the Signal Processing and Information Theory track: "Distributed Computation of Likelihood Maps for Target Tracking" by Jonathan Gallagher, Randolph Moses and Emre Ertin.
You may like...
- Computer Architecture: A Minimalist… by William F. Gilreath, Phillip A. Laplante (Hardcover, R4,136)
- Distributed Algorithms - 3rd… by Jean-Claude Bermond, Michel Raynal (Paperback, R1,512)
- I. C. S. Reference Library: Types of… by International Correspondence Schools (Paperback, R744)
- Artificial Intelligence for Accurate… by Sandeep Kautish, Gaurav Dhiman (Hardcover, R7,962)