The instant access that hackers have to the latest tools and techniques demands that companies become more aggressive in defending the security of their networks. Conducting a network vulnerability assessment, a self-induced hack attack, identifies the network components and the faults in policies and procedures that expose a company to the damage caused by malicious network intruders.
Combinatorial optimization is a multidisciplinary scientific area, lying at the interface of three major scientific domains: mathematics, theoretical computer science and management. The three volumes of the Combinatorial Optimization series aim to cover a wide range of topics in this area. These topics deal with fundamental notions and approaches as well as with several classical applications of combinatorial optimization. Concepts of Combinatorial Optimization is divided into three parts:
- On the complexity of combinatorial optimization problems, presenting basics of worst-case and randomized complexity;
- Classical solution methods, presenting the two best-known methods for solving hard combinatorial optimization problems, namely Branch-and-Bound and Dynamic Programming;
- Elements from mathematical programming, presenting fundamentals of the mathematical-programming-based methods that have been at the heart of Operations Research since the origins of the field.
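As a brief, hedged illustration of the dynamic programming approach named above (the sketch is mine, not taken from the book), the classic 0/1 knapsack problem can be solved with a one-dimensional value table:

```python
# Minimal dynamic-programming sketch for the 0/1 knapsack problem,
# a classical combinatorial optimization example (illustrative only).
def knapsack(values, weights, capacity):
    """Return the maximum total value achievable within the given capacity."""
    best = [0] * (capacity + 1)                 # best[c] = best value using capacity c
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):    # iterate downwards: each item used at most once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

print(knapsack(values=[60, 100, 120], weights=[1, 2, 3], capacity=5))  # -> 220
```

Branch-and-Bound attacks the same kind of problem differently: it enumerates partial solutions and prunes any branch whose optimistic bound cannot beat the best complete solution found so far.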
This book provides a theoretical and application oriented analysis of deterministic scheduling problems arising in computer and manufacturing environments. In such systems processors (machines) and possibly other resources are to be allocated among tasks in such a way that certain scheduling objectives are met. Various scheduling problems are discussed where different problem parameters such as task processing times, urgency weights, arrival times, deadlines, precedence constraints, and processor speed factor are involved. Polynomial and exponential time optimization algorithms as well as approximation and heuristic approaches (including tabu search, simulated annealing, genetic algorithms, and ejection chains) are presented and discussed. Moreover, resource-constrained, imprecise computation, flexible flow shop and dynamic job shop scheduling, as well as flexible manufacturing systems, are considered.
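As a hedged sketch of one of the heuristic approaches listed above (a toy example of mine, not code from the book), simulated annealing can search the space of job orders on a single machine, occasionally accepting cost-increasing swaps to escape local optima:

```python
# Illustrative simulated annealing for single-machine scheduling, minimizing
# total weighted tardiness; each job is (processing_time, weight, due_date).
import math
import random

def weighted_tardiness(order, jobs):
    t = cost = 0
    for j in order:
        p, w, d = jobs[j]
        t += p
        cost += w * max(0, t - d)
    return cost

def anneal(jobs, steps=20000, temp=10.0, cooling=0.9995):
    order = list(range(len(jobs)))
    cost = weighted_tardiness(order, jobs)
    best, best_cost = order[:], cost
    for _ in range(steps):
        i, k = random.sample(range(len(jobs)), 2)      # propose swapping two jobs
        order[i], order[k] = order[k], order[i]
        new_cost = weighted_tardiness(order, jobs)
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost                            # accept (always, if not worse)
            if cost < best_cost:
                best, best_cost = order[:], cost
        else:
            order[i], order[k] = order[k], order[i]    # reject: undo the swap
        temp *= cooling                                # cool down gradually
    return best, best_cost

jobs = [(3, 2, 4), (2, 1, 3), (4, 3, 6), (1, 2, 2)]
print(anneal(jobs))
```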
Current practice dictates the separation of the hardware and software development paths early in the design cycle. These paths remain independent with very little interaction occurring between them until system integration. In particular, hardware is often specified without fully appreciating the computational requirements of the software. Also, software development does not influence hardware development and does not track changes made during the hardware design phase. Thus, the ability to explore hardware/software tradeoffs is restricted, such as the movement of functionality from the software domain to the hardware domain (and vice-versa) or the modification of the hardware/software interface. As a result, problems that are encountered during system integration may require modification of the software and/or hardware, resulting in potentially significant cost increases and schedule overruns. To address the problems described above, a cooperative design approach, one that utilizes a unified view of hardware and software, is described. This approach is called hardware/software codesign. The Codesign of Embedded Systems develops several fundamental hardware/software codesign concepts and a methodology that supports them. A unified representation, referred to as a decomposition graph, is presented which can be used to describe hardware or software using either functional abstractions or data abstractions. Using a unified representation based on functional abstractions, an abstract hardware/software model has been implemented in a common simulation environment called ADEPT (Advanced Design Environment Prototyping Tool). This model permits early hardware/software evaluation and tradeoff exploration. Techniques have been developed which support the identification of software bottlenecks and the evaluation of design alternatives with respect to multiple metrics. The application of the model is demonstrated on several examples. A unified representation based on data abstractions is also explored. This work leads to investigations regarding the application of object-oriented techniques to hardware design. The Codesign of Embedded Systems: A Unified Hardware/Software Representation describes a novel approach to a topic of immense importance to CAD researchers and designers alike.
Value-Driven IT Management explains how huge sums are wasted by companies (and governments) on poorly aligned, poorly justified and poorly managed IT projects based on 'wishful thinking' cost and benefit assumptions, and why even 'successful' projects rarely seem to realise the benefits promised.
This collection of excellent papers cultivates a new perspective on agent-based social system sciences, gaming simulation, and their hybridization. Most of the papers included here were presented in the special session titled Agent-Based Modeling Meets Gaming Simulation at ISAGA2003, the 34th annual conference of the International Simulation and Gaming Association (ISAGA), held at Kazusa Akademia Park in Kisarazu, Chiba, Japan, August 25-29, 2003. This post-proceedings was supported by the twenty-first century COE (Centers of Excellence) program Creation of Agent-Based Social Systems Sciences (ABSSS), established at the Tokyo Institute of Technology in 2004. The present volume comprises papers submitted to the special session of ISAGA2003 and provides a good example of the diverse scope and standard of research achieved in simulation and gaming today. The theme of the special session at ISAGA2003 was Agent-Based Modeling Meets Gaming Simulation. Nowadays, agent-based simulation is becoming very popular for modeling and solving complex social phenomena. It is also used to arrive at practical solutions to social problems. At the same time, however, the validity of a simulation does not lie in the magnificence of the model. R. Axelrod stresses the simplicity of the agent-based simulation model through the "Keep it simple, stupid" (KISS) principle: as an ideal, simple modeling is essential.
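In that KISS spirit, a minimal agent-based model can be only a few lines long; the following sketch (my own illustration, not taken from the proceedings) lets agents on a ring repeatedly adopt the local majority opinion of their neighbourhood:

```python
# Bare-bones agent-based simulation: binary opinions on a ring, local majority rule.
import random

def step(opinions):
    n = len(opinions)
    return [
        1 if opinions[(i - 1) % n] + opinions[i] + opinions[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]

random.seed(0)
state = [random.randint(0, 1) for _ in range(20)]
for _ in range(5):
    print("".join(map(str, state)))   # watch clusters of agreement form
    state = step(state)
```

Even a model this simple can exhibit emergent clustering, which is exactly the kind of behaviour gaming simulations can then probe further with human participants.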
Interactive Whiteboards for Education: Theory, Research and Practice emphasizes the importance of professional development, credible educational research, and dialogue between teachers, administrators, policymakers and learners. This book intends to guide and inform the process of technology integration in education, introducing valuable case studies for educators interested in present and future IWB technology.
As we begin to venture outside of lockdown, photographers of all skill levels will be eager to capture the world around them. In Decisive Moments, Andy Hall combines his photographic and teaching experience by putting together a thirty-year retrospective collection of stunning images, each of which has a key learning feature for photographers to reflect on. Throughout, Hall teaches and inspires photographers of all abilities, from beginners to experienced practitioners, and helps them to identify photographic opportunities and make successful images consistently. The advice is applicable to users of all types of cameras, from professional DSLRs to smartphones. This is a must-have book not only for photographers who want to achieve their full potential but also for people who simply enjoy the visual world around them.
Choose the right hardware and software for your school! This unique book is the first systematic work on evaluating and assessing educational information technology. Here you'll find specific strategies, best practices, and techniques to help you choose the educational technology that is most appropriate for your institution. Evaluation and Assessment in Educational Information Technology will show you how to measure the effects of information technology on teaching and learning, help you determine the extent of technological integration into the curriculum that is best for your school, and point you toward the most effective ways to teach students and faculty to use new technology. Evaluation and Assessment in Educational Information Technology presents: a summary of the last ten years of assessment instrument development; seven well-validated instruments that gauge attitudes, beliefs, skills, competencies, and technology integration proficiencies; two content analysis instruments for analyzing teacher-student interaction patterns in a distance learning setting; an examination of the best uses of computerized testing as opposed to conventional tests, as used in local settings, to meet daily instructional needs, in online delivery programs, in public domain software, and in available commercial and shareware options; successful pedagogical and assessment strategies for use in online settings; a four-dimensional model to assess student learning in instructional technology courses; three models for assessing the significance of information technology in education from a teacher's perspective; an incisive look at Michigan's newly formed Consortium of Outstanding Achievement in Teaching with Technology (COATT); ways to use electronic portfolios for teaching/learning performance assessment; and much more!
This book addresses issues associated with the interface of computing, optimisation, econometrics and financial modeling, emphasizing computational optimisation methods and techniques. The first part addresses optimisation problems and decision modeling, plus applications of supply chain and worst-case modeling and advances in methodological aspects of optimisation techniques. The second part covers optimisation heuristics, filtering, signal extraction and time series models. The final part discusses optimisation in portfolio selection and real option modeling.
What Makes this Book Unique? No crystal ball is required to safely predict that in the future - even more than in the past - mastered innovativeness will be a primary criterion distinguishing successful from unsuccessful companies. At the latest since Michael Porter's study on the competitiveness of nations, the same criterion holds even for the evaluation of entire countries and national economies. Despite the innumerable publications and recommendations on innovation, competitive innovativeness is still a rare competency. The latest publication of UNICE - the European industry organization representing 20 million large, midsize and small companies - speaks a clear language: Europe qualifies to roughly 60% (70%) of the innovation strength of the US (Japan). The record unemployment in many EU countries does not contradict this message. A main reason may be given by the fact that becoming an innovative organization means increased openness towards the new and more tolerance towards risks and failures, both challenging the inherently difficult management art of cultural change. Further, lacking innovativeness is often related to legal and fiscal barriers which rather hinder than foster innovative activities. Yet another reason to explain Europe's notorious innovation gap refers to insufficient financial R&D resources on the company as well as on the national level. As a result, for example, high-ranking decisions on the level of the European Commission are taken to increase R&D expenditures in the European Union from roughly 2% to 3% of GNP.
Until now, those preparing to take the Certified Information Systems Security Professional (CISSP) examination were not afforded the luxury of studying a single, easy-to-use manual. Written by ten subject matter experts (SMEs) - all CISSPs - this test prep book allows CISSP candidates to test their current knowledge in each of the ten security domains that make up the Common Body of Knowledge (CBK) on which the CISSP examination is based. The Total CISSP Exam Prep Book: Practice Questions, Answers, and Test Taking Tips and Techniques provides an outline of the subjects, topics, and sub-topics contained within each domain in the CBK, and with it you can readily identify terms and concepts that you will need to know for the exam.
The Web is the nervous system of information society. As such, it has a pervasive influence on our daily lives. And yet, in some ways the Web does not have a high MIQ (Machine IQ). What can be done to enhance it? This is the leitmotif of "Intelligent Exploration of the Web" (IEW)--a collection of articles co-edited by Drs. Szczepaniak, Segovia, Kacprzyk and, to a small degree, myself. The articles that comprise IEW address many basic problems ranging from structure analysis of Internet documents and Web dialogue management to intelligent Web agents for extraction of information, and bootstrapping an ontology-based information extraction system. Among the basic problems, one that stands out in importance is the problem of search. Existing search engines have many remarkable capabilities. But what is not among them is the deduction capability--the capability to answer a query by drawing on information which resides in various parts of the knowledge base. An example of a query might be "How many Ph.D. degrees in computer science were granted by European universities in 1996?" No existing search engine is capable of dealing with queries of comparable or even much lower complexity. Basically, what we would like to do is to add deduction capability to a search engine, with the aim of transforming it into a question-answering system, or a Q/A system, for short. This is a problem that is of major importance and a challenge that is hard to meet.
Hardware Software Co-Design of a Multimedia SOC Platform is one of the first books of its kind to provide a comprehensive overview of the design and implementation of the hardware and software of an SoC platform for multimedia applications. Topics covered in this book range from system level design methodology, multimedia algorithm implementation, a sub-word parallel, single-instruction-multiple-data (SIMD) processor design, and its virtual platform implementation, to the development of an SIMD parallel compiler as well as a real-time operating system (RTOS). Hardware Software Co-Design of a Multimedia SOC Platform is written for practitioner engineers and technical managers who want to gain first-hand knowledge about the hardware-software design process of an SoC platform. It offers both tutorial-like details to help readers become familiar with a diverse range of subjects, and in-depth analysis for advanced readers to pursue further.
Critical Infrastructure Protection II describes original research results and innovative applications in the interdisciplinary field of critical infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. This book is the second volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.10 on Critical Infrastructure Protection, an international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts focused on infrastructure protection. The book contains a selection of twenty edited papers from the Second Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection held at George Mason University, Arlington, Virginia, USA in the spring of 2008.
Computers have become an essential component of modern biology. They help to manage the vast and increasing amount of biological data and continue to play an integral role in the discovery of new biological relationships. This in silico approach to biology has helped to reshape the modern biological sciences. With the biological revolution now among us, it is imperative that each scientist develop and hone today's bioinformatics skills, if only at a rudimentary level. Bioinformatics Methods and Protocols was conceived as part of the Methods in Molecular Biology series to meet this challenge and to provide the experienced user with useful tips and an up-to-date overview of current developments. It builds upon the foundation that was provided in the two-volume set published in 1994 entitled Computer Analysis of Sequence Data. We divided Bioinformatics Methods and Protocols into five parts, including a thorough survey of the basic sequence analysis software packages that are available at most institutions, as well as the design and implementation of an essential introductory Bioinformatics course. In addition, we included sections describing specialized noncommercial software, databases, and other resources available as part of the World Wide Web and a stimulating discussion of some of the computational challenges biologists now face and likely future solutions.
New Infrastructures for Knowledge Production: Understanding E-Science offers a distinctive understanding of new infrastructures for knowledge production based in science and technology studies. This field offers a unique potential to assess systematically the prospects for new modes of science enabled by information and communication technologies. The authors use varied methodological approaches, reviewing the origins of initiatives to develop e-science infrastructures, exploring the diversity of the various solutions and the scientific cultures that use them, and assessing the prospects for wholesale change in scientific structures and practices. The book contains practical advice for the design of appropriate technological solutions, and long-range assessments of the prospects for change, useful both to policy makers and to those implementing institutional infrastructures. Readers interested in understanding contemporary science will gain a rich picture of the practices and the technologies that are shaping the knowledge production of the future.
Chapter 3 Specifying RTL Properties
3.1 Definitions and concepts
3.1.1 Property
3.1.2 Events
3.2 Property classification
3.2.1 Safety versus liveness
3.2.2 Constraint versus assertion
3.2.3 Declarative versus procedural
3.3 RTL assertion specification techniques
3.3.1 RTL invariant assertions
3.3.2 Declaring properties with PSL
3.3.3 RTL cycle related assertions
3.3.4 PSL and default clock declaration
3.3.5 Specifying sequences
3.3.6 Specifying eventualities
3.3.7 PSL built-in functions
3.4 Pragma-based assertions
3.5 SystemVerilog assertions
3.5.1 Immediate assertions
3.5.2 Concurrent assertions
3.5.3 System functions
3.6 PCI property specification example
3.6.1 PCI overview
3.7 Summary
Chapter 4 PLI-Based Assertions
4.1 Procedural assertions
4.1.1 A simple PLI assertion
4.1.2 Assertions within a simulation time slot
4.1.3 Assertions across simulation time slots
4.1.4 False firing across multiple time slots
4.2 PLI-based assertion library
4.2.1 Assert quiescent state
4.3 Summary
Chapter 5 Functional Coverage
5.1 Verification approaches
5.2 Understanding coverage
5.2.1 Controllability versus observability
Thinking Machines and the Philosophy of Computer Science: Concepts and Principles presents a conversation between established experts and new researchers in the field of philosophy and computer science about human and non-human relationships with the environment. This resource contains five sections including topics on philosophical analysis, the posterior ethical debate, the nature of computer simulations, and the crossroads between robotics, AI, cognitive theories and philosophy.
For over three decades now, silicon capacity has steadily been doubling every year and a half with equally staggering improvements continuously being observed in operating speeds. This increase in capacity has allowed for more complex systems to be built on a single silicon chip. Coupled with this functionality increase, speed improvements have fueled tremendous advancements in computing and have enabled new multi-media applications. Such trends, aimed at integrating higher levels of circuit functionality are tightly related to an emphasis on compactness in consumer electronic products and a widespread growth and interest in wireless communications and products. These trends are expected to persist for some time as technology and design methodologies continue to evolve and the era of Systems on a Chip has definitely come of age. While technology improvements and spiraling silicon capacity allow designers to pack more functions onto a single piece of silicon, they also highlight a pressing challenge for system designers to keep up with such amazing complexity. To handle higher operating speeds and the constraints of portability and connectivity, new circuit techniques have appeared. Intensive research and progress in EDA tools, design methodologies and techniques is required to empower designers with the ability to make efficient use of the potential offered by this increasing silicon capacity and complexity and to enable them to design, test, verify and build such systems.
Test functions (fault detection, diagnosis, error correction, repair, etc.) that are applied concurrently while the system continues its intended function are defined as on-line testing. In its expanded scope, on-line testing includes the design of concurrent error checking subsystems that can be themselves self-checking, fail-safe systems that continue to function correctly even after an error occurs, reliability monitoring, and self-test and fault-tolerant designs. On-Line Testing for VLSI contains a selected set of articles that discuss many of the modern aspects of on-line testing as faced today. The contributions are largely derived from recent IEEE International On-Line Testing Workshops. Guest editors Michael Nicolaidis, Yervant Zorian and Dhiraj Pradhan organized the articles into six chapters. In the first chapter the editors introduce a large number of approaches with an expanded bibliography in which some references date back to the sixties. On-Line Testing for VLSI is an edited volume of original research comprising invited contributions by leading researchers.
Within the last three decades, interest in the psychological experience of human faces has drawn together cognitive science researchers from diverse backgrounds. Computer scientists talk to neural scientists who draw on the work of mathematicians who explicitly influence those conducting behavioral experiments.
More and more, the shape of the IT organization is critical to business systems delivery, yet all too often this definition is approached in a haphazard fashion - often based on old theory and out-dated experiences rather than being moulded to the realities of the world in which we work. Shaping the IT Organization considers how one should go about the moulding of an IT function in order to ensure effective output from the resources within that organization. It focuses on understanding precisely the elements and challenges within such a definition. Key topics covered:
- What is an organization?: Issues and key considerations for IT from an organizational perspective, including the idea of the 'organization lifecycle' and the very real impact this can have within the IT environment.
- Why change?: The impact of generic business approaches demanded by current business models and pressures.
- Solutions vs Products: The IT organizational impact of moving from a product-based to a solutions-based business model.
- Outsourcing: The increasing trend to place critical elements of IT's delivery capability outside the core business means that IT functions are often poorly aligned both to manage these relationships and to rise to the challenges that outsourcing offers.
- Resource Management: Fundamental questions about people and the need to adapt resource management approaches to take a radical approach to how we both manage and empower the people within those models in order to deliver what is required.
The book presents the state of the art in high performance computing and simulation on modern supercomputer architectures. It covers trends in hardware and software development in general and specifically the future of vector-based systems and heterogeneous architectures. The application contributions include computational fluid dynamics, physics, chemistry, astrophysics, and biology. Innovative application fields like multiphysics simulations and material science are presented.