The Second International Conference on High-Performance Computing and Applications (HPCA 2009) was a follow-up event of the successful HPCA 2004. It was held in Shanghai, a beautiful, active, and modern city in China, August 10-12, 2009. It served as a forum to present current work by researchers and software developers from around the world as well as to highlight activities in the high-performance computing area. It aimed to bring together research scientists, application pioneers, and software developers to discuss problems and solutions and to identify new issues in this area. This conference emphasized the development and study of novel approaches for high-performance computing, the design and analysis of high-performance numerical algorithms, and their scientific, engineering, and industrial applications. It offered the conference participants a great opportunity to exchange the latest research results, heighten international collaboration, and discuss future research ideas in HPCA. In addition to 24 invited presentations, the conference received over 300 contributed submissions from over ten countries and regions worldwide, about 70 of which were accepted for presentation at HPCA 2009. The conference proceedings contain some of the invited presentations and contributed submissions, and cover such research areas of interest as numerical algorithms and solutions, high-performance and grid computing, novel approaches to high-performance computing, massive data storage and processing, hardware acceleration, and their wide applications.
Despite its importance, the role of HdS is most often underestimated and the topic is not well represented in literature and education. To address this, Hardware-dependent Software brings together experts from different HdS areas. By providing a comprehensive overview of general HdS principles, tools, and applications, this book offers insight into current technology and upcoming developments in the domain of HdS. The reader will find an interesting textbook with self-contained introductions to the principles of Real-Time Operating Systems (RTOS), the emerging BIOS successor UEFI, and the Hardware Abstraction Layer (HAL). Other chapters cover industrial applications, verification, and tool environments. Tool introductions cover the application of tools in the ASIP software tool chain (i.e., Tensilica) and the generation of drivers and OS components from C-based languages. Applications focus on telecommunication and automotive systems.
The building blocks of today's and future embedded systems are complex intellectual property components, or cores, many of which are programmable processors. Traditionally, these embedded processors have mostly been programmed in assembly languages for efficiency reasons. This implies time-consuming programming, extensive debugging, and low code portability. The requirements of short time-to-market and dependability of embedded systems are obviously much better met by using high-level language (e.g. C) compilers instead of assembly. However, the use of C compilers frequently incurs a code quality overhead compared to manually written assembly programs. Due to the need for efficient embedded systems, this overhead must be very low in order to make compilers useful in practice. In turn, this requires new compiler techniques that take the specific constraints of embedded system design into account. One example is the specialized architectures of recent DSP and multimedia processors, which are not yet sufficiently exploited by existing compilers.
To the hard-pressed systems designer this book will come as a godsend. It is a hands-on guide to the many ways in which processor-based systems are designed to allow low-power operation. Covering a huge range of topics, and co-authored by some of the field's top practitioners, the book provides a good starting point for engineers in the area and for research students embarking upon work on embedded systems and architectures.
There is a tremendous interest in the design and applications of agents in virtually every area, including avionics, business, the Internet, engineering, health sciences, and management. There is no single agreed definition of an agent, but we can define an agent as a computer program that autonomously or semi-autonomously acts on behalf of the user. In the last five years, the transition of intelligent systems research in general, and agent-based research in particular, from a laboratory environment into the real world has resulted in the emergence of several phenomena. These trends can be placed in three categories, namely humanization, architectures, and learning and adaptation. These phenomena are distinct from the traditional logic-centered approach associated with the agent paradigm. Humanization of agents can be understood, among other aspects, in terms of the semantic quality of agent design. The need to humanize agents is to allow practitioners and users to make more effective use of this technology; it relates to the semantic quality of the agent design. Further, context-awareness is another aspect which has assumed importance in the light of ubiquitous computing and ambient intelligence. The widespread and varied use of agents, on the other hand, has created a need for agent-based software development frameworks and design patterns, as well as architectures for situated interaction, negotiation, e-commerce, e-business, and information retrieval. Finally, traditional agent designs did not incorporate human-like abilities of learning and adaptation.
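The blurb above defines an agent as a program that autonomously or semi-autonomously acts on behalf of a user. Purely as a rough illustration of that definition (not anything taken from the book), a minimal perceive-decide-act loop might look like the following sketch; the MailFilterAgent name and its keyword rules are hypothetical.

```python
# Minimal, hypothetical illustration of the "agent" definition above:
# a program that acts (semi-)autonomously on behalf of its user.
from dataclasses import dataclass, field

@dataclass
class MailFilterAgent:
    """Toy agent that sorts incoming messages on the user's behalf."""
    spam_words: set = field(default_factory=lambda: {"lottery", "prize"})
    log: list = field(default_factory=list)

    def perceive(self, message: str) -> dict:
        # Sense the environment: extract simple features from the input.
        return {"text": message,
                "spammy": any(w in message.lower() for w in self.spam_words)}

    def decide(self, percept: dict) -> str:
        # Autonomous decision rule; a learning agent would adapt this over time.
        return "discard" if percept["spammy"] else "deliver"

    def act(self, message: str) -> str:
        action = self.decide(self.perceive(message))
        self.log.append((message, action))
        return action

if __name__ == "__main__":
    agent = MailFilterAgent()
    print(agent.act("You won a lottery prize!"))  # discard
    print(agent.act("Meeting moved to 3pm"))      # deliver
```

The learning and adaptation trend mentioned in the blurb would replace the fixed decide() rule with one updated from user feedback.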
Computer architecture presently faces an unprecedented revolution: the step from monolithic processors towards multi-core ICs, motivated by the ever-increasing need for power and energy efficiency in nanoelectronics. Whether you prefer to call it MPSoC (multi-processor system-on-chip) or CMP (chip multiprocessor), no doubt this revolution affects large domains of both computer science and electronics, and it poses many new interdisciplinary challenges. For instance, efficient programming models and tools for MPSoC are largely an open issue: "Multi-core platforms are a reality - but where is the software support" (R. Lauwereins, IMEC). Solving it will require enormous research efforts as well as the education of a whole new breed of software engineers that bring the results from universities into industrial practice. At the same time, the design of complex MPSoC architectures is an extremely time-consuming task, particularly in the wireless and multimedia application domains, where heterogeneous architectures are predominant. Due to the exploding NRE and mask costs, most companies are now following a platform approach: invest a large (but one-time) design effort into a proper core architecture, and create easy-to-design derivatives for new standards or product features. Needless to say, only the most efficient MPSoC platforms have a real chance to enjoy a multi-year lifetime on the highly competitive semiconductor market for embedded systems.
This book constitutes the refereed proceedings of the 4th International Workshop on Self-Organizing Systems, IWSOS 2009, held in Zurich, Switzerland, in December 2009. The 14 revised full papers and 13 revised short papers presented were carefully selected from the 34 full and 27 short paper submissions. The papers are organized in topical sections on ad hoc and sensor networks; services, storage, and internet routing; peer-to-peer systems; theory and general approaches; overlay networks; peer-to-peer systems and internet routing; wireless networks; and network topics.
This volume contains the extended papers selected for presentation at the ninth edition of the International Symposium on Web & Wireless Geographical Information Systems (W2GIS 2009), hosted by the National Centre for Geocomputation in NUI Maynooth (Ireland). W2GIS 2009 was the ninth in a series of successful events beginning with Kyoto 2001, and alternating locations between East Asia and Europe. We invited submissions that provided an up-to-date review of advances in theoretical, technical, and practical issues of W2GIS and Intelligent GeoMedia. Reports on ongoing implementations and real-world applications research were particularly welcome at this symposium. Now in its ninth year, the scope of W2GIS has expanded to include continuing advances in wireless and Internet technologies that generate ever-increasing interest in the diffusion, usage, and processing of geo-referenced data of all types - geomedia. Spatially aware wireless and Internet devices offer new ways of accessing and analyzing geo-spatial information in both real-world and virtual spaces. Consequently, new challenges and opportunities are provided that expand the traditional GIS research scope into the realm of intelligent media - including geomedia with context-aware behaviors for self-adaptive use and delivery. Our common aim is research-based innovation that increases the ease of creating, delivering, and using geomedia across different platforms and application domains that continue to have a dramatic effect on today's society.
This book constitutes the joint refereed proceedings of the Third International Workshop on Communication Technologies for Vehicles, Nets4Cars 2011, and the First International Workshop on Communication Technologies for Vehicles in the Railway Transportation, Nets4Trains 2011, held in Oberpfaffenhofen, Germany, in March 2011. The 7 full papers of the rail track and the 12 full papers of the road track, presented together with a keynote, were carefully reviewed and selected from 13 and 21 submissions respectively. They provide an overview of the latest technologies and research in the field of intra- and inter-vehicle communication and present original research results in areas relating to communication protocols and standards, mobility and traffic models, experimental and field operational testing, and performance analysis.
As future generation information technology (FGIT) becomes specialized and fragmented, it is easy to lose sight of the fact that many topics in FGIT have common threads and, because of this, advances in one discipline may be transmitted to others. Presentation of recent results obtained in different disciplines encourages this interchange for the advancement of FGIT as a whole. Of particular interest are hybrid solutions that combine ideas taken from multiple disciplines in order to achieve something more significant than the sum of the individual parts. Through such a hybrid philosophy, a new principle can be discovered, which has the propensity to propagate throughout multifaceted disciplines. FGIT 2009 was the first mega-conference that attempted to follow the above idea of hybridization in FGIT in the form of multiple events related to particular disciplines of IT, conducted by separate scientific committees but coordinated in order to expose the most important contributions. It included the following international conferences: Advanced Software Engineering and Its Applications (ASEA), Bio-Science and Bio-Technology (BSBT), Control and Automation (CA), Database Theory and Application (DTA), Disaster Recovery and Business Continuity (DRBC; published independently), Future Generation Communication and Networking (FGCN) that was combined with Advanced Communication and Networking (ACN), Grid and Distributed Computing (GDC), Multimedia, Computer Graphics and Broadcasting (MulGraB), Security Technology (SecTech), Signal Processing, Image Processing and Pattern Recognition (SIP), and u- and e-Service, Science and Technology (UNESST).
LION 3, the Third International Conference on Learning and Intelligent OptimizatioN, was held during January 14-18 in Trento, Italy. The LION series of conferences provides a platform for researchers who are interested in the intersection of efficient optimization techniques and learning. It is aimed at exploring the boundaries and uncharted territories between machine learning, artificial intelligence, mathematical programming, and algorithms for hard optimization problems. The considerable interest in the topics covered by LION was reflected by the overwhelming number of 86 submissions, which almost doubled the 48 submissions received for LION's second edition in December 2007. As in the first two editions, the submissions to LION 3 could be in three formats: (a) original, novel, and unpublished work for publication in the post-conference proceedings, (b) extended abstracts of work in progress or a position statement, and (c) recently submitted or published journal articles for oral presentation. The 86 submissions received include 72, ten, and four articles for categories (a), (b), and (c), respectively.
The 2009 International Conference on Artificial Intelligence and Computational Intelligence (AICI 2009) was held during November 7-8, 2009 in Shanghai, China. The technical program of the conference reflects the tremendous growth in the fields of artificial intelligence and computational intelligence, with contributions from a large number of participants around the world. AICI 2009 received 1,203 submissions from 20 countries and regions. After rigorous reviews, 79 high-quality papers were selected for this volume, representing an acceptance rate of 6.6%. These selected papers cover many new developments and their applications in the fields of artificial intelligence and computational intelligence. Their publication reflects a sustained interest from the wide academic community worldwide in tirelessly pursuing new solutions through effective utilization of artificial intelligence and computational intelligence for real-world problems. We would like to specially thank all the committee members and reviewers, without whose timely help it would have been impossible to review all the submitted papers and assemble this program. We would also like to take this opportunity to express our heartfelt appreciation for all those who worked together in organizing this conference, establishing the technical programs, and running the conference meetings. We greatly appreciate the authors, speakers, invited session organizers, session chairs, and others who made this conference possible. Lastly, we would like to express our gratitude to the Shanghai University of Electric Power for its sponsorship and support of the conference.
This volume contains the proceedings of the Second International Workshop on Mobile Entity Localization and Tracking in GPS-less Environments (MELT 2009), held in Orlando, Florida on September 30, 2009 in conjunction with the 11th International Conference on Ubiquitous Computing (Ubicomp 2009). MELT provides a forum for the presentation of state-of-the-art technologies in mobile localization and tracking and novel applications of location-based services. MELT 2009 continued the success of the first workshop in the series (MELT 2008), which was held in San Francisco, California on September 19, 2008 in conjunction with Mobicom. Location-awareness is a key component for achieving context-awareness. Recent years have witnessed an increasing trend towards location-based services and applications. In most cases, however, location information is limited by the accessibility of GPS, which is unavailable for indoor or underground facilities and unreliable in urban environments. Much research has been done, in both the sensor network community and the ubiquitous computing community, to provide techniques for localization and tracking in GPS-less environments. Novel applications based on ad-hoc localization and real-time tracking of mobile entities are growing as a result of these technologies. MELT brings together leaders from both the academic and industrial research communities to discuss challenging and open problems, to evaluate the pros and cons of various approaches, to bridge the gap between theory and applications, and to envision new research opportunities.
These proceedings contain the papers presented at VoteID 2009, the Second International Conference on E-voting and Identity. The conference was held in Luxembourg during September 7-8, 2009, hosted by the University of Luxembourg. VoteID 2009 built on the success of the 2007 edition held in Bochum. Events have moved on dramatically in the intervening two years: at the time of writing, people are in the streets of Tehran protesting against the claimed outcome of the June 12th presidential election in Iran. Banners bearing the words "Where is my vote?" bear testimony to the strength of feeling and the need for elections to be trusted. These events show that the search for high-assurance voting is not a purely academic pursuit but one of very real importance. We hope that VoteID 2009 will help contribute to our understanding of the foundations of democracy. The Program Committee selected 11 papers for presentation at the conference out of a total of 24 submissions. Each submission was reviewed by at least four Program Committee members. The EasyChair conference management system proved instrumental in the reviewing process as well as in the preparation of these proceedings. The selected papers cover a wide range of aspects of voting: proposals for high-assurance voting systems, evaluation of existing systems, assessment of public response to electronic voting, and legal aspects. The program also included a keynote by Mark Ryan.
During the 1980s and early 1990s there was significant work in the design and implementation of hardware neurocomputers. Nevertheless, most of these efforts may be judged to have been unsuccessful: at no time have hardware neurocomputers been in wide use. This lack of success may be largely attributed to the fact that earlier work was almost entirely aimed at developing custom neurocomputers based on ASIC technology, but for such niche areas this technology was never sufficiently developed or competitive enough to justify large-scale adoption. On the other hand, gate-arrays of the period mentioned were never large enough nor fast enough for serious artificial-neural-network (ANN) applications. But technology has now improved: the capacity and performance of current FPGAs are such that they present a much more realistic alternative. Consequently, neurocomputers based on FPGAs are now a much more practical proposition than they have been in the past. This book summarizes some work towards this goal and consists of 12 papers that were selected, after review, from a number of submissions. The book is nominally divided into three parts: Chapters 1 through 4 deal with foundational issues; Chapters 5 through 11 deal with a variety of implementations; and Chapter 12 looks at the lessons learned from a large-scale project and also reconsiders design issues in light of current and future technology.
Introduction: The goal of this book is to introduce XML to a bioinformatics audience. It does so by introducing the fundamentals of XML, Document Type Definitions (DTDs), XML Namespaces, XML Schema, and XML parsing, and illustrating these concepts with specific bioinformatics case studies. The book does not assume any previous knowledge of XML and is geared toward those who want a solid introduction to fundamental XML concepts. The book is divided into nine chapters. Chapter 1: Introduction to XML for Bioinformatics. This chapter provides an introduction to XML and describes the use of XML in biological data exchange. A bird's-eye view of our first case study, the Distributed Annotation System (DAS), is provided, and we examine a sample DAS XML document. The chapter concludes with a discussion of the pros and cons of using XML in bioinformatic applications. Chapter 2: Fundamentals of XML and BSML. This chapter introduces the fundamental concepts of XML and the Bioinformatic Sequence Markup Language (BSML). We explore the origins of XML, define basic rules for XML document structure, and introduce XML Namespaces. We also explore several sample BSML documents and visualize these documents in the Rescentris Genomic Workspace Viewer.
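The blurb above centers on XML documents and XML parsing for annotation data. As a rough, hedged illustration of that idea only - using a simplified, hypothetical annotation document rather than the actual DAS or BSML schemas described in the book - a few lines of Python with the standard-library xml.etree.ElementTree module might look like this:

```python
# Hedged sketch: parsing a simplified, hypothetical annotation document.
# The element and attribute names below are illustrative only, not the
# real DAS or BSML vocabularies covered by the book.
import xml.etree.ElementTree as ET

SAMPLE = """<?xml version="1.0"?>
<ANNOTATIONS>
  <SEGMENT id="chr1" start="1" stop="1000">
    <FEATURE id="f1" label="exon" start="100" stop="250"/>
    <FEATURE id="f2" label="exon" start="400" stop="520"/>
  </SEGMENT>
</ANNOTATIONS>
"""

root = ET.fromstring(SAMPLE)
for segment in root.findall("SEGMENT"):
    print("segment", segment.get("id"))
    for feature in segment.findall("FEATURE"):
        # Attributes are returned as strings; convert coordinates as needed.
        start, stop = int(feature.get("start")), int(feature.get("stop"))
        print("  ", feature.get("label"), start, stop)
```

A schema language such as a DTD or XML Schema, as the book discusses, would constrain which elements and attributes a document like this may contain.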
Model Based Fuzzy Control uses a given conventional or fuzzy open loop model of the plant under control to derive the set of fuzzy rules for the fuzzy controller. Of central interest are the stability, performance, and robustness of the resulting closed loop system. The major objective of model based fuzzy control is to use the full range of linear and nonlinear design and analysis methods to design such fuzzy controllers with better stability, performance, and robustness properties than non-fuzzy controllers designed using the same techniques. This objective has already been achieved for fuzzy sliding mode controllers and fuzzy gain schedulers - the main topics of this book. The primary aim of the book is to serve as a guide for the practitioner and to provide introductory material for courses in control theory.
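Since the blurb names fuzzy gain scheduling as one of the book's main topics, a minimal sketch may help fix the idea: blend the gains of two locally tuned linear controllers using triangular membership functions over an operating variable (a Takagi-Sugeno-style interpolation). This is a generic illustration under assumed gains and operating points, not the book's own design procedure.

```python
# Minimal fuzzy gain-scheduling sketch (Takagi-Sugeno-style blending).
# Gains, operating points, and membership ranges are assumed for illustration.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Two local proportional gains tuned at "low" and "high" operating points.
KP_LOW, KP_HIGH = 2.0, 0.5

def scheduled_gain(op):
    """Blend the local gains by their normalized membership degrees."""
    w_low = tri(op, -1.0, 0.0, 1.0)   # degree of "operating point is low"
    w_high = tri(op, 0.0, 1.0, 2.0)   # degree of "operating point is high"
    total = w_low + w_high or 1.0     # guard against division by zero
    return (w_low * KP_LOW + w_high * KP_HIGH) / total

def control(error, op):
    # Proportional control action with the fuzzily scheduled gain.
    return scheduled_gain(op) * error

if __name__ == "__main__":
    for op in (0.0, 0.5, 1.0):
        print(f"op={op:.1f}  Kp={scheduled_gain(op):.2f}  u={control(0.3, op):.2f}")
```

Closed-loop stability and robustness of such a scheduler, analyzed with the linear and nonlinear methods the blurb mentions, is exactly the kind of question the book addresses.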
This book constitutes the refereed proceedings of the 23rd International Symposium on Distributed Computing, DISC 2009, held in Elche, Spain, in September 2009. The 33 revised full papers, selected from 121 submissions, are presented together with 15 brief announcements of ongoing work; all of them were carefully reviewed and selected for inclusion in the book. The papers address all aspects of distributed computing and are organized in topical sections on the Michel Raynal and Shmuel Zaks 60th birthday symposium, award nominees, transactional memory, shared memory, distributed and local graph algorithms, modeling issues, game theory, failure detectors, from theory to practice, graph algorithms and routing, consensus and Byzantine agreement, and radio networks.
Networks on Chip presents a variety of topics, problems, and approaches with the common theme of systematically organizing on-chip communication in the form of a regular, shared communication network on chip, an NoC for short. As the number of processor cores and IP blocks integrated on a single chip steadily grows, a systematic approach to designing the communication infrastructure becomes necessary. Different variants of packet-switched on-chip networks have been proposed by several groups during the past two years. This book summarizes the state of the art of these efforts and discusses the major issues from physical integration to architecture to operating systems and application interfaces. It also provides a guideline and vision about the direction in which this field is moving. Moreover, the book outlines the consequences of adopting design platforms based on packet-switched networks. The consequences may in fact be far-reaching, because many of the topics of distributed systems, distributed real-time systems, fault-tolerant systems, parallel computer architecture, and parallel programming, as well as traditional system-on-chip issues, will appear relevant, but within the constraints of a single-chip VLSI implementation. The book is organized in three parts. The first deals with system design and methodology issues. The second presents problems and solutions concerning the hardware and the basic communication infrastructure. Finally, the third part covers operating systems, embedded software, and applications. However, communication from the physical to the application level is a central theme throughout the book. The book serves as an excellent reference source and may be used as a text for advanced courses on the subject.
Embedded systems take over complex control and data processing tasks in diverse application fields such as automotive, avionics, consumer products, and telecommunications. They are the primary driver for improving overall system safety, efficiency, and comfort. The demand for further improvement in these aspects can only be satisfied by designing embedded systems of increasing complexity, which in turn necessitates the development of new system design methodologies based on specification, design, and verification languages. The objective of the book at hand is to provide researchers and designers with an overview of current research trends, results, and application experiences in computer languages for embedded systems. The book builds upon the most relevant contributions to the 2008 conference Forum on Design Languages (FDL), the premier international conference specializing in this field. These contributions have been selected based on the results of reviews provided by leading experts from research and industry. In many cases, the authors have improved their original work by adding breadth, depth, or explanation.
The International Workshop on "Human Interaction with Machines" is the sixth in a successful series of workshops established by Shanghai Jiao Tong University and Technische Universität Berlin. The goal of these workshops is to bring together researchers from both universities in order to present research results to an international community. The series of workshops started in 1990 with the International Workshop on "Artificial Intelligence" and was continued with the International Workshop on "Advanced Software Technology" in 1994. Both workshops were hosted by Shanghai Jiao Tong University. In 1998 the third workshop took place in Berlin. This International Workshop on "Communication Based Systems" was essentially based on results from the Graduiertenkolleg on Communication Based Systems that was funded by the German Research Society (DFG) from 1991 to 2000. The fourth International Workshop on "Robotics and its Applications" was held in Shanghai in 2000. The fifth International Workshop on "The Internet Challenge: Technology and Applications" was hosted by TU Berlin in 2002.
ESL or "Electronic System Level" is a buzzword these days in the electronic design automation (EDA) industry, in design houses, and in academia. Even though numerous trade magazine articles have been written and quite a few books have been published that have attempted to define ESL, it is still not clear what exactly it entails. However, what seems clear to everyone is that the "Register Transfer Level" (RTL) languages are no longer adequate to be the design entry point for today's and tomorrow's complex electronic system design. There are multiple reasons for such thoughts. First, the continued progression of the miniaturization of silicon technology has led to the ability to put almost a billion transistors on a single chip. Second, applications are becoming more and more complex and integrated with communication, control, and ubiquitous and pervasive computing, and hence the need for ever faster, ever more reliable, and more robust electronic systems is pushing designers towards a productivity demand that is not sustainable without a fundamental change in design methodologies. Also, hardware and software functionalities are becoming interchangeable, and the ability to model and design both in the same manner is gaining importance. Given this context, we assume that any methodology that allows us to model an entire electronic system from a system perspective, rather than just hardware with discrete-event or cycle-based semantics, is an ESL methodology of some kind.
This year's edition of the international federated conferences on Distributed Computing Techniques took place in Lisbon during June 9-11, 2009. It was hosted by the Faculty of Sciences of the University of Lisbon and formally organized by Instituto de Telecomunicações. The DisCoTec conferences jointly cover the complete spectrum of distributed computing subjects, ranging from theoretical foundations to formal specification techniques to practical considerations. The event consisted of the 11th International Conference on Coordination Models and Languages (COORDINATION), the 9th IFIP International Conference on Distributed Applications and Interoperable Systems (DAIS), and the IFIP International Conference on Formal Techniques for Distributed Systems (FMOODS/FORTE). COORDINATION focused on languages, models, and architectures for concurrent and distributed software. DAIS emphasized methods, techniques, and system infrastructures needed to design, build, operate, evaluate, and manage modern distributed applications in any kind of application environment and scenario. FMOODS (11th Formal Methods for Open Object-Based Distributed Systems) joined forces with FORTE (29th Formal Techniques for Networked and Distributed Systems), creating a forum for fundamental research on theory and applications of distributed systems.
This book constitutes the refereed proceedings of the 7th International Conference on Smart Homes and Health Telematics, ICOST 2009, held in Tours, France, in July 2009. The 27 revised full papers and 20 short papers presented were carefully reviewed and selected from numerous submissions. The papers are organized in topical sections on cognitive assistance and chronic disease management; ambient living systems; service continuity and context awareness; user modeling and human-machine interaction; ambient intelligence modeling and privacy issues; and human behavior and activity monitoring.