In recent years, cloud computing has gained a significant amount of attention by providing more flexible ways to store applications remotely. With software testing continuing to be an important part of the software engineering life cycle, the emergence of software testing in the cloud has the potential to change the way software testing is performed. Software Testing in the Cloud: Perspectives on an Emerging Discipline is a comprehensive collection of research by leading experts in the field providing an overview of cloud computing and current issues in software testing and system migration. Deserving the attention of researchers, practitioners, and managers, this book aims to raise awareness about this new field of study.
I3E 2010 marked the 10th anniversary of the IFIP Conference on e-Business, e-Services, and e-Society, continuing a tradition that began in 1998 with the International Conference on Trends in Electronic Commerce, TrEC 1998, in Hamburg (Germany). Three years later the inaugural I3E 2001 conference was held in Zurich (Switzerland). Since then I3E has made its journey through the world: 2002 Lisbon (Portugal), 2003 Sao Paulo (Brazil), 2004 Toulouse (France), 2005 Poznan (Poland), 2006 Turku (Finland), 2007 Wuhan (China), 2008 Tokyo (Japan), and 2009 Nancy (France). I3E 2010 took place in Buenos Aires (Argentina) November 3-5, 2010. Known as "The Pearl" of South America, Buenos Aires is a cosmopolitan, colorful, and vibrant city, surprising its visitors with a vast variety of cultural and artistic performances, European architecture, and the passion for tango, coffee places, and football discussions. A cultural reference in Latin America, the city hosts 140 museums, 300 theaters, and 27 public libraries including the National Library. It is also the main educational center in Argentina and home of renowned universities including the University of Buenos Aires, created in 1821. Besides location, the timing of I3E 2010 is also significant: it coincided with the 200th anniversary celebration of the first local government in Argentina.
This book introduces the basic methodologies for successful data analytics. Matrix optimization and approximation are explained in detail and extensively applied to dimensionality reduction by principal component analysis and multidimensional scaling. Diffusion maps and spectral clustering are derived as powerful tools. The methodological overlap between data science and machine learning is emphasized by demonstrating how data science is used for classification as well as supervised and unsupervised learning.
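To make the dimensionality-reduction material concrete, here is a minimal sketch of principal component analysis via the singular value decomposition, using only NumPy. The data matrix, random seed, and component count are illustrative assumptions, not examples from the book.

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components.

    X: (n_samples, n_features) data matrix.
    k: number of components to keep (k <= n_features).
    """
    X_centered = X - X.mean(axis=0)  # center each feature
    # SVD of the centered data; rows of Vt are the principal directions
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:k].T     # coordinates in the reduced space

# Illustrative use: reduce 5-dimensional points to 2 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca(X, 2)
print(Z.shape)  # (100, 2)
```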
In recent years Genetic Algorithms (GA) and Artificial Neural Networks (ANN) have progressively increased in importance amongst the techniques routinely used in chemometrics. This book contains contributions from experts in the field and is divided into two sections (GA and ANN). In each part, tutorial chapters are included in which the theoretical bases of each technique are expertly (but simply) described. These are followed by application chapters in which special emphasis is given to the advantages of applying GA or ANN to a specific problem, compared to classical techniques, and to the risks connected with their misuse.
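As a rough illustration of the kind of genetic algorithm the tutorial chapters cover, the following sketch evolves fixed-length bit strings under selection, crossover, and mutation. The fitness function ("OneMax"), population size, and rates are hypothetical choices, not the book's.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      crossover_rate=0.9, mutation_rate=0.02):
    """Minimal generational GA over fixed-length bit strings."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                       # elitism: keep the best two
        while len(next_pop) < pop_size:
            p1, p2 = random.sample(scored[:10], 2)  # truncation selection
            if random.random() < crossover_rate:    # one-point crossover
                cut = random.randrange(1, n_bits)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            for i in range(n_bits):                 # bit-flip mutation
                if random.random() < mutation_rate:
                    child[i] ^= 1
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Illustrative fitness: maximize the number of ones in the string.
best = genetic_algorithm(sum)
print(sum(best), best)
```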
Locally computable (NC0) functions are "simple" functions for which every bit of the output can be computed by reading a small number of bits of their input. The study of locally computable cryptography attempts to construct cryptographic functions that achieve this strong notion of simplicity and simultaneously provide a high level of security. Such constructions are highly parallelizable and they can be realized by Boolean circuits of constant depth. This book establishes, for the first time, the possibility of local implementations for many basic cryptographic primitives such as one-way functions, pseudorandom generators, encryption schemes and digital signatures. It also extends these results to other stronger notions of locality, and addresses a wide variety of fundamental questions about local cryptography. The author's related thesis was honorably mentioned (runner-up) for the ACM Dissertation Award in 2007, and this book includes some expanded sections and proofs, and notes on recent developments. The book assumes only a minimal background in computational complexity and cryptography and is therefore suitable for graduate students or researchers in related areas who are interested in parallel cryptography. It also introduces general techniques and tools which are likely to interest experts in the area.
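To illustrate the notion of locality, here is a toy NC0-style function in which every output bit depends on at most three input bits; the particular predicate is a hypothetical example for intuition, not a construction from the book.

```python
def local_xor_and(x):
    """Toy 'local' function: output bit i reads only input bits
    i, i+1, i+2 (indices mod n), so each output bit depends on
    at most three input bits -- the defining property of NC0.
    """
    n = len(x)
    return [x[i] ^ (x[(i + 1) % n] & x[(i + 2) % n]) for i in range(n)]

print(local_xor_and([1, 0, 1, 1, 0]))
```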
This book presents advances in alternative swarm development that have proved to be effective in several complex problems. Swarm intelligence (SI) is a problem-solving methodology that results from the cooperation between a set of agents with similar characteristics. The study of biological entities, such as animals and insects, manifesting social behavior has resulted in several computational models of swarm intelligence. While there are numerous books addressing the most widely known swarm methods, namely ant colony algorithms and particle swarm optimization, those discussing new alternative approaches are rare. The focus on developments based on the simple modification of popular swarm methods overlooks the opportunity to discover new techniques and procedures that can be useful in solving problems formulated by the academic and industrial communities. Presenting various novel swarm methods and their practical applications, the book helps researchers, lecturers, engineers and practitioners solve their own optimization problems.
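For readers new to the area, a minimal sketch of the best-known swarm method, particle swarm optimization, shows the agent-cooperation pattern that the book's alternative methods build on. The objective function, search bounds, and coefficients are illustrative assumptions.

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization minimizing f over [-5, 5]^dim."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]    # each particle's best-known position
    gbest = min(pbest, key=f)[:]   # swarm-wide best-known position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + pull toward personal best + pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

# Illustrative use: minimize the sphere function.
sphere = lambda x: sum(v * v for v in x)
print(pso(sphere))
```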
Dynamically Reconfigurable Systems is the first book to focus on the emerging field of dynamically reconfigurable computing systems. While programmable logic and design-time configurability are well elaborated and covered by various texts, this book presents a unique overview of the state of the art and recent results for dynamic and run-time reconfigurable computing systems. Reconfigurable hardware is not only of utmost importance for large manufacturers and vendors of microelectronic devices and systems, but also a very attractive technology for smaller and medium-sized companies. Hence, Dynamically Reconfigurable Systems also addresses researchers and engineers actively working in the field and provides them with information on the newest developments and trends in dynamic and run-time reconfigurable systems.
This book offers a coherent and comprehensive approach to feature subset selection in the scope of classification problems, explaining the foundations, real application problems and the challenges of feature selection for high-dimensional data. The authors first focus on the analysis and synthesis of feature selection algorithms, presenting a comprehensive review of basic concepts and experimental results of the most well-known algorithms. They then address different real scenarios with high-dimensional data, showing the use of feature selection algorithms in different contexts with different requirements and information: microarray data, intrusion detection, tear film lipid layer classification and cost-based features. The book then delves into the scenario of big dimension, paying attention to important problems under high-dimensional spaces, such as scalability, distributed processing and real-time processing, scenarios that open up new and interesting challenges for researchers. The book is useful for practitioners, researchers and graduate students in the areas of machine learning and data mining.
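As a small illustration of the filter-style feature selection the review chapters discuss (not an algorithm taken from the book), the following sketch ranks features by absolute Pearson correlation with the label and keeps the top k; the scoring criterion and synthetic data are placeholders.

```python
import numpy as np

def select_k_best(X, y, k):
    """Filter-style feature selection: keep the k features whose
    absolute Pearson correlation with the label is highest."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # correlation of each feature column with y
    corr = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum()))
    top = np.sort(np.argsort(-np.abs(corr))[:k])
    return top, X[:, top]

# Illustrative use on synthetic data where features 0 and 3 carry signal.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.1, size=200)
idx, X_sel = select_k_best(X, y, 2)
print(idx)  # expected: [0 3]
```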
The demand for large-scale dependable systems, such as Air Traffic Management, industrial plants and space systems, is attracting the efforts of many world-leading European companies and SMEs in the area, and is expected to increase in the near future. The adoption of Off-The-Shelf (OTS) items plays a key role in such a scenario. OTS items allow mastering complexity and reducing costs and time-to-market; however, achieving these goals while ensuring dependability requirements at the same time is challenging. The CRITICAL STEP project establishes a strategic collaboration between academic and industrial partners, and proposes a framework to support the development of dependable, OTS-based, critical systems. The book introduces methods and tools adopted by the critical systems industry, and surveys key achievements of the CRITICAL STEP project along four directions: fault injection tools, V&V of critical systems, runtime monitoring and evaluation techniques, and security assessment.
Extensive research conducted by the Hasso Plattner Design Thinking Research Program at Stanford University in Palo Alto, California, USA, and the Hasso Plattner Institute in Potsdam, Germany, has yielded valuable insights on why and how design thinking works. The participating researchers have identified metrics, developed models, and conducted studies, which are featured in this book and in the previous volumes of this series. This volume provides readers with tools to bridge the gap between research and practice in design thinking, with varied real-world examples. Several different approaches to design thinking are presented in this volume. Acquired frameworks are leveraged to understand design thinking team dynamics. The contributing authors lead the reader through new approaches and application fields and show that design thinking can tap the potential of digital technologies in a human-centered way. In a final section, new ideas in neurodesign at Stanford University and at Hasso Plattner Institute in Potsdam are elaborated upon, thereby challenging the reader to consider newly developed methodologies, with discussion of how these insights can be applied to various sectors. Special emphasis is placed on understanding the mechanisms underlying design thinking at the individual and team levels. Design thinking can be learned. It has a methodology that can be observed across multiple settings and, accordingly, the reader can adopt new frameworks to modify and update existing practice. The research outcomes compiled in this book are intended to inform and provide inspiration for all those seeking to drive innovation - be they experienced design thinkers or newcomers.
A formal method is not the main engine of a development process; its contribution is to improve system dependability by motivating formalisation where useful. This book summarizes the results of the DEPLOY research project on engineering methods for dependable systems through the industrial deployment of formal methods in software development. The applications considered were in automotive, aerospace, railway, and enterprise information systems, and microprocessor design. The project introduced a formal method, Event-B, into several industrial organisations and built on the lessons learned to provide an ecosystem of better tools, documentation and support to help others to select and introduce rigorous systems engineering methods. The contributing authors report on these projects and the lessons learned. For the academic and research partners and the tool vendors, the project identified improvements required in the methods and supporting tools, while the industrial partners learned about the value of formal methods in general. A particular feature of the book is the frank assessment of the managerial and organisational challenges, the weaknesses in some current methods and supporting tools, and the ways in which they can be successfully overcome. The book will be of value to academic researchers, systems and software engineers developing critical systems, industrial managers, policymakers, and regulators.
Model-driven software development drastically alters the software development process, which is characterized by a high degree of innovation and productivity. Emerging Technologies for the Evolution and Maintenance of Software Models contains original academic work about current research and research projects related to all aspects affecting the maintenance, evolution, and reengineering (MER), as well as long-term management, of software models. The mission of this book is to present a comprehensive and central overview of new and emerging trends in software model research and to provide concrete results from ongoing developments in the field.
Each day, new applications and methods are developed for utilizing technology in the field of medical sciences, both as diagnostic tools and as methods for patients to access their medical information through their personal gadgets. However, the maximum potential for the application of new technologies within the medical field has not yet been realized. Mobile Devices and Smart Gadgets in Medical Sciences is a pivotal reference source that explores different mobile applications, tools, software, and smart gadgets and their applications within the field of healthcare. Covering a wide range of topics such as artificial intelligence, telemedicine, and oncology, this book is ideally designed for medical practitioners, mobile application developers, technology developers, software experts, computer engineers, programmers, ICT innovators, policymakers, researchers, academicians, and students.
This book describes the benefits that emerge when the fields of constraint programming and concurrency meet. On the one hand, constraints can be used in concurrency theory to increase the conciseness and the expressive power of concurrent languages from a pragmatic point of view. On the other hand, problems modeled using constraints can be solved faster and more efficiently on a concurrent system. Both directions are explored, providing two separate lines of development. First, the book studies the expressive power of a concurrent language, namely Constraint Handling Rules, that supports constraints as a primitive construct, and shows which features of this language make it Turing powerful. Then a framework is proposed to solve constraint problems, intended to be deployed on a concurrent system; for its development the concurrent language Jolie, which follows the service-oriented paradigm, is used. Based on this experience, an extension to service-oriented languages is also proposed in order to overcome some of their limitations and to improve the development of concurrent applications.
Information security and copyright protection are more important today than ever before. Digital watermarking is one of the most widely used techniques in the area of information security. This book introduces a number of digital watermarking techniques and is divided into four parts. The first part introduces the importance of watermarking techniques and intelligent technology. The second part presents a number of watermarking techniques. The third part covers hybrid watermarking techniques, and the final part presents conclusions. The book is directed at students, professors, researchers and application engineers who are interested in the area of information security.
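To give a flavor of the simplest kind of technique surveyed, here is a minimal least-significant-bit (LSB) embedder and extractor over a grayscale pixel array. LSB embedding is a standard textbook method, and the tiny arrays here are illustrative; the book's own schemes are not reproduced.

```python
def embed_lsb(pixels, bits):
    """Embed watermark bits into the least significant bit of each pixel."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def extract_lsb(pixels, n):
    """Recover the first n watermark bits from the pixel LSBs."""
    return [p & 1 for p in pixels[:n]]

# Illustrative use on a tiny grayscale "image".
image = [200, 37, 142, 90, 255, 16]
mark = [1, 0, 1, 1, 0, 1]
stego = embed_lsb(image, mark)
assert extract_lsb(stego, len(mark)) == mark
print(stego)
```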
This book contains the collection of papers presented at the conference of the International Federation for Information Processing Working Group 8.2 "Information and Organizations." The conference took place during June 21-24, 2009 at the Universidade do Minho in Guimaraes, Portugal. The conference, entitled "CreativeSME - The Role of IS in Leveraging the Intelligence and Creativity of SME's," attracted high-quality submissions from across the world. Each paper was reviewed by at least two reviewers in a double-blind review process. In addition to the 19 papers presented at the conference, there were five panels and four workshops, which covered a range of issues relevant to SMEs, creativity and information systems. We would like to show our appreciation of the efforts of our two invited keynote speakers, Michael Dowling of the University of Regensburg, Germany and Carlos Zorrinho, Portuguese coordinator of the Lisbon Strategy and the Technological Plan. The following organizations supported the conference through financial or other contributions, and we would like to thank them for their engagement.
The present book is the result of a three-year research project which investigated the creative act of composing by means of algorithmic composition. Central to the investigation are the compositional strategies of 12 composers, which were documented through a dialogic and cyclic process of modelling and evaluating musical materials. The aesthetic premises and compositional approaches configure a rich spectrum of diverse positions, which is reflected also in the kinds of approaches and methods used. These approaches and methods include the generation and evaluation of chord sequences using genetic algorithms, the application of morphing strategies to research harmonic transformations, an automatic classification of personal preferences via machine learning, and an application of mathematical music theory to the analysis and resynthesis of musical material. The second part of the book features contributions by Sandeep Bhagwati, William Brooks, David Cope, Darla Crispin, Nicolas Donin, and Guerino Mazzola. These authors variously consider the project from different perspectives, offer independent approaches, or provide more general reflections from their respective research fields.
Zuse's textbook on software measurement provides basic principles as well as theoretical and practical guidelines for the use of numerous kinds of software measures. It is written to enable scientists, teachers, practitioners, and students to define the basic terminology of software measurement and to contribute to theory building. The textbook considers, among others, the qualitative and numerical models behind software measures. It explains step by step the importance of qualitative properties, the meaning of scale types, the foundations of the validation of measures and of prediction models, the models behind the Function-Point method and the COCOMO model, and the qualitative assumptions of object-oriented measures. For applications of software measures in practice, more than two hundred software measures across the software life-cycle are described in detail (object-oriented measures included). The enclosed CD contains a selection of more than 1,600 literature references and a small demo version of ZD-MIS (Zuse/Drabe Measurement Information System).
Fundamental Problems in Computing is in honor of Professor Daniel J. Rosenkrantz, a distinguished researcher in Computer Science. Professor Rosenkrantz has made seminal contributions to many subareas of Computer Science including formal languages and compilers, automata theory, algorithms, database systems, very large scale integrated systems, fault-tolerant computing and discrete dynamical systems. For many years, Professor Rosenkrantz served as the Editor-in-Chief of the Journal of the Association for Computing Machinery (JACM), a very prestigious archival journal in Computer Science. His contributions to Computer Science have earned him many awards including the Fellowship from ACM and the ACM SIGMOD Contributions Award.
Cryptographic applications, such as the RSA algorithm, ElGamal cryptography, elliptic curve cryptography, the Rabin cryptosystem, the Diffie-Hellman key exchange algorithm, and the Digital Signature Standard, use modular exponentiation extensively. The performance of all these applications strongly depends on the efficient implementation of modular exponentiation and modular multiplication. Since 1984, when Montgomery first introduced a method to evaluate modular multiplications, many algorithmic modifications have been made to improve the efficiency of modular multiplication, but far less work has been done on improving the efficiency of modular exponentiation itself. This research monograph addresses the question: how can the performance of modular exponentiation, the crucial operation of many public-key cryptographic techniques, be improved? The book focuses on energy-efficient modular exponentiation for cryptographic hardware. Spread across five chapters, this well-researched text focuses in detail on bit forwarding techniques and the corresponding hardware realizations. Readers will also discover advanced performance improvement techniques based on high-radix multiplication and cryptographic hardware based on multi-core architectures.
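As background, the baseline that such hardware techniques refine is binary (square-and-multiply) modular exponentiation. A minimal sketch follows, with illustrative operands; the book's bit-forwarding and high-radix refinements are not reproduced here.

```python
def mod_exp(base, exponent, modulus):
    """Left-to-right binary (square-and-multiply) modular exponentiation.

    Scans the exponent bits from most to least significant:
    square at every step, multiply in the base when the bit is 1.
    """
    result = 1
    base %= modulus
    for bit in bin(exponent)[2:]:               # exponent bits, MSB first
        result = (result * result) % modulus    # always square
        if bit == '1':
            result = (result * base) % modulus  # conditional multiply
    return result

# Illustrative check against Python's built-in three-argument pow.
assert mod_exp(7, 560, 561) == pow(7, 560, 561)
print(mod_exp(7, 560, 561))
```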
Software Engineering Techniques Applied to Agricultural Systems presents cutting-edge software engineering techniques for designing and implementing better agricultural software systems based on the object-oriented paradigm and the Unified Modeling Language (UML). The focus is on the presentation of rigorous step-by-step approaches for modeling flexible agricultural and environmental systems, starting with a conceptual diagram representing elements of the system and their relationships. Furthermore, diagrams such as sequence and collaboration diagrams are used to explain the dynamic and static aspects of the software system. This second edition includes: a new chapter on the Object Constraint Language (OCL), a new section dedicated to the Model-View-Controller (MVC) design pattern, new chapters presenting details of two MDA-based tools - Virtual Enterprise and Olivia Nova - and a new chapter with exercises on conceptual modeling. It may be highly useful to undergraduate and graduate students, as the first edition has proven to be a useful supplementary textbook for courses in mathematical programming in agriculture, ecology, information technology, agricultural operations research methods, agronomy and soil science, and applied mathematical modeling. The book has broad appeal for anyone involved in software development projects in agriculture and to researchers in general who are interested in modeling complex systems. From the reviews of the first edition: "The book will be useful for those interested in gaining a quick understanding of current software development techniques and how they are applied in practice... this is a good introductory text on the application of OOAD, UML and design patterns to the creation of agricultural systems. It is technically sound and well written." -Computing Reviews, September 2006
New Internet developments pose greater and greater privacy dilemmas. In the Information Society, the need for individuals to protect their autonomy and retain control over their personal information is becoming more and more important. Today, information and communication technologies - and the people responsible for making decisions about them, designing, and implementing them - scarcely consider those requirements, thereby potentially putting individuals' privacy at risk. The increasingly collaborative character of the Internet enables anyone to compose services and contribute and distribute information. It may become hard for individuals to manage and control information that concerns them and particularly how to eliminate outdated or unwanted personal information, thus leaving personal histories exposed permanently. These activities raise substantial new challenges for personal privacy at the technical, social, ethical, regulatory, and legal levels: How can privacy in emerging Internet applications such as collaborative scenarios and virtual communities be protected? What frameworks and technical tools could be utilized to maintain life-long privacy? During September 3-10, 2009, IFIP (International Federation for Information Processing) working groups 9.2 (Social Accountability), 9.6/11.7 (IT Misuse and the Law), 11.4 (Network Security) and 11.6 (Identity Management) held their 5th International Summer School in cooperation with the EU FP7 integrated project PrimeLife in Sophia Antipolis and Nice, France. The focus of the event was on privacy and identity management for emerging Internet applications throughout a person's lifetime. The aim of the IFIP Summer Schools has been to encourage young academic and industry entrants to share their own ideas about privacy and identity management and to build up collegial relationships with others. As such, the Summer Schools have been introducing participants to the social implications of information technology through the process of informed discussion.
The three-volume set IFIP AICT 368-370 constitutes the refereed post-conference proceedings of the 5th IFIP TC 5, SIG 5.1 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2011, held in Beijing, China, in October 2011. The 189 revised papers presented were carefully selected from numerous submissions. They cover a wide range of interesting theories and applications of information technology in agriculture, including simulation models and decision-support systems for agricultural production, agricultural product quality testing, traceability and e-commerce technology, the application of information and communication technology in agriculture, and universal information service technology and service systems development in rural areas. The 59 papers included in the third volume focus on simulation, optimization, monitoring, and control technology.