Organisational Semiotics offers an effective approach to analysing organisations and modelling organisational behaviour. The methods and techniques derived from Organisational Semiotics enable us to study an organisation by examining how information is created and used for communication, coordination and the performance of actions towards organisational objectives. The latest developments of this young discipline and its applications are reported in this book, which provides a useful guide and a valuable reference for anyone working in the areas of organisational study and information systems development.
This book presents the most recent advances in fuzzy clustering techniques and their applications. The contents include: Introduction to Fuzzy Clustering; Fuzzy Clustering Based Principal Component Analysis; Fuzzy Clustering Based Regression Analysis; Kernel Based Fuzzy Clustering; Evaluation of Fuzzy Clustering; and Self-Organized Fuzzy Clustering. The book is directed to computer scientists, engineers, scientists, professors and students of engineering, science, computer science, business, management, avionics and related disciplines.
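To give a flavour of the book's subject, here is a minimal sketch of the classic fuzzy c-means algorithm, the foundation of most fuzzy clustering methods. This is a generic illustration, not code taken from the book:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=50, seed=0):
    """Minimal fuzzy c-means: returns cluster centres and a membership
    matrix U, where U[i, k] is the degree to which point i belongs to
    cluster k (rows sum to 1)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)        # normalise initial memberships
    for _ in range(n_iter):
        Um = U ** m                          # fuzzified memberships
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # distance from each point to each centre (small epsilon avoids /0)
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        # standard update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1))).sum(axis=2)
    return centres, U

# two well-separated blobs: memberships should become nearly crisp (0/1)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(10.0, 0.1, (20, 2))])
centres, U = fuzzy_c_means(X, c=2)
```

Unlike hard k-means, every point keeps a graded membership in every cluster, which is what the "fuzzy" in fuzzy clustering refers to.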
Geocomputation may be viewed as the application of a computational science paradigm to the study of a wide range of problems in geographical systems contexts. This volume presents a clear, comprehensive and thoroughly state-of-the-art overview of current research, written by leading figures in the field. It provides important insights into this new and rapidly developing field, attempting to establish principles and develop techniques for solving real-world problems in a wide array of application domains, and acting as a catalyst to a greater understanding of what geocomputation is and what it entails. The broad coverage makes it invaluable reading for researchers and professionals in geography and the environmental and economic sciences, as well as for graduate students of spatial science and computer science.
This text describes the advanced concepts and techniques used for ASIC chip synthesis, formal verification and static timing analysis, using the Synopsys suite of tools. In addition, the entire ASIC design flow methodology targeted for VDSM (Very-Deep-Sub-Micron) technologies is covered in detail. The emphasis of this book is on real-time application of Synopsys tools used to combat various problems seen at VDSM geometries. Readers are exposed to an effective design methodology for handling complex, sub-micron ASIC designs. Significance is placed on HDL coding styles, synthesis and optimization, dynamic simulation, formal verification, DFT scan insertion, links to layout, and static timing analysis. At each step, problems related to each phase of the design flow are identified, with solutions and work-arounds described in detail. In addition, crucial issues related to layout, which includes clock tree synthesis and back-end integration (links to layout) are also discussed at length. The book is intended for anyone who is involved in the ASIC design methodology, starting from RTL synthesis to final tape-out. Target audiences for this book are practicing ASIC design engineers and graduate students undertaking advanced courses in ASIC chip design and DFT techniques.
Digital Timing Macromodeling for VLSI Design Verification first of all provides an extensive history of the development of simulation techniques. It presents detailed discussion of the various techniques implemented in circuit, timing, fast-timing, switch-level timing, switch-level, and gate-level simulation. It also discusses mixed-mode simulation and interconnection analysis methods. The review in Chapter 2 gives an understanding of the advantages and disadvantages of the many techniques applied in modern digital macromodels. The book also presents a wide variety of techniques for performing nonlinear macromodeling of digital MOS subcircuits which address a large number of shortcomings in existing digital MOS macromodels. Specifically, the techniques address the device model detail, transistor coupling capacitance, effective channel length modulation, series transistor reduction, effective transconductance, input terminal dependence, gate parasitic capacitance, the body effect, the impact of parasitic RC-interconnects, and the effect of transmission gates. The techniques address major sources of errors in existing macromodeling techniques, which must be addressed if macromodeling is to be accepted in commercial CAD tools by chip designers. The techniques presented in Chapters 4-6 can be implemented in other macromodels, and are demonstrated using the macromodel presented in Chapter 3. The new techniques are validated over an extremely wide range of operating conditions: much wider than has been presented for previous macromodels, thus demonstrating the wide range of applicability of these techniques.
This book explores non-extensive statistical mechanics in non-equilibrium thermodynamics, and presents an overview of the strong nonlinearity of chaos and complexity in natural systems, drawing on relevant mathematics from topology, measure-theory, inverse and ill-posed problems, set-valued analysis, and nonlinear functional analysis. It offers a self-contained theory of complexity and complex systems as the steady state of non-equilibrium systems, denoting a homeostatic dynamic equilibrium between stabilizing order and destabilizing disorder.
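Non-extensive statistical mechanics is commonly associated with the Tsallis entropy (which particular formulation the book uses is an assumption here). As an illustration, the Tsallis entropy S_q = (1 - Σ p_i^q)/(q - 1) generalises the Shannon entropy and recovers it in the limit q → 1:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a discrete
    probability distribution p; reduces to the Shannon entropy as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        # limiting case: ordinary Shannon entropy (natural log)
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

s2 = tsallis_entropy([0.5, 0.5], 2.0)          # (1 - 0.5) / 1 = 0.5
s_near1 = tsallis_entropy([0.5, 0.5], 1.0001)  # approaches ln 2
```

The "non-extensive" label comes from the fact that for q ≠ 1 the entropy of two independent systems is not simply the sum of the parts.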
This book encapsulates some work done in the DIRC project concerned with trust and responsibility in socio-technical systems. It brings together a range of disciplinary approaches - computer science, sociology and software engineering - to produce a socio-technical systems perspective on the issues surrounding trust in technology in complex settings. Computer systems can only bring about their purported benefits if functionality, users and usability are central to their design and deployment. Thus, technology can only be trusted in situ and in everyday use if these issues have been brought to bear on the process of technology design, implementation and use. The studies detailed in this book analyse the ways in which trust in technology is achieved and/or worked around in everyday situations in a range of settings - including hospitals, a steelworks, a public enquiry, the financial services sector and air traffic control.
Computer-Aided Verification is a collection of papers that begins with a general survey of hardware verification methods. Ms. Gupta starts with the issue of verification itself and develops a taxonomy of verification methodologies, focusing especially upon recent advances. Although her emphasis is hardware verification, most of what she reports applies to software verification as well. Graphical presentation is coming to be a de facto requirement for a 'friendly' user interface. The second paper presents a generic format for graphical presentations of coordinating systems represented by automata. The last two papers, as a pair, present a variety of generic techniques for reducing the computational cost of computer-aided verification based upon explicit computational memory: the first of the two gives a time-space trade-off, while the second gives a technique which trades space for a (sometimes predictable) probability of error. Computer-Aided Verification is an edited volume of original research. This research work has also been published as a special issue of the journal Formal Methods in System Design, 1:2-3.
REAL-TIME MANAGEMENT OF RESOURCE ALLOCATION SYSTEMS focuses on the problem of managing the resource allocation taking place within the operational context of many contemporary technological applications, including flexibly automated production systems, automated railway and/or monorail transportation systems, electronic workflow management systems, and business transaction supporting systems. A distinct trait of all these applications is that they limit the role of the human element to remote high-level supervision, while placing the burden of the real-time monitoring and coordination of the ongoing activity upon a computerized control system. Hence, any applicable control paradigm must address not only the issues of throughput maximization, work-in-process inventory reduction, and delay and cost minimization, which have been the typical concerns for past studies on resource allocation, but it must also guarantee the operational correctness and the behavioral consistency of the underlying automated system. The resulting problem is rather novel for the developers of these systems, since, in the past, many of its facets were left to the jurisdiction of the human intelligence present. It is also complex, due to the high levels of choice - otherwise known as flexibility - inherent in the operation of these environments. This book proposes a control paradigm that offers a comprehensive and integrated solution to both the behavioral/logical and the performance-oriented control problems underlying the management of the resource allocation taking place in the aforementioned highly automated technological applications. Building upon a series of fairly recent results from Discrete Event Systems theory, the proposed paradigm is distinguished by: (i) its robustness to the experienced stochasticities and operational contingencies; (ii) its scalability to the large-scale nature of the target technological applications; and (iii) its operational efficiency.
These three properties are supported through the adoption of a "closed-loop" structure for the proposed control scheme, and also through a pertinent decomposition of the overall control function into a logical and a performance-oriented controller for the underlying resource allocation. REAL-TIME MANAGEMENT OF RESOURCE ALLOCATION SYSTEMS provides a rigorous study of the control problems addressed by each of these two controllers, and of their integration into a unified control function. A notion of optimal control is formulated for each of these problems, but it turns out that the corresponding optimal policies are computationally intractable. Hence, a large part of the book is devoted to the development of effective and computationally efficient approximations for these optimal control policies, especially for those that correspond to the more novel logical control problem.
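The logical-control side of resource allocation described above centres on guaranteeing operational correctness, i.e. keeping the system out of deadlock. A minimal illustration in the spirit of such logical controllers is the classic Banker's-algorithm safety check (a textbook technique, not the book's own policy):

```python
def is_safe(available, allocation, need):
    """Banker's-style safety check: True if, from the current state, every
    process can run to completion in some order (so no deadlock is possible).
    available: free units per resource type; allocation[i]/need[i]: units
    held / still needed by process i."""
    work = list(available)
    finished = [False] * len(allocation)
    progress = True
    while progress:
        progress = False
        for i, done in enumerate(finished):
            # a process can finish if its remaining need fits in free resources
            if not done and all(n <= w for n, w in zip(need[i], work)):
                # it then releases everything it holds
                work = [w + a for w, a in zip(work, allocation[i])]
                finished[i] = True
                progress = True
    return all(finished)

# classic textbook state: 5 processes, 3 resource types -- this state is safe
safe = is_safe([3, 3, 2],
               [[0, 1, 0], [2, 0, 0], [3, 0, 2], [2, 1, 1], [0, 0, 2]],
               [[7, 4, 3], [1, 2, 2], [6, 0, 0], [0, 1, 1], [4, 3, 1]])
```

A logical controller of this flavour would grant a resource request only if the resulting state still passes the safety check; the book's Discrete Event Systems treatment is considerably more general.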
Digital Humanities is rapidly evolving as a significant approach to teaching, learning and research across the humanities. This is a first-stop book for people interested in getting to grips with Digital Humanities, whether as a student or a professor. The book offers a practical guide to the area as well as reflection on its main objectives and processes, including: accessible introductions to the basics of Digital Humanities through to more complex ideas; a wide range of topics, from feminist Digital Humanities, digital journal publishing, gaming and text encoding to project management and pedagogy; contextualised case studies; and resources for starting Digital Humanities, such as links, training materials and exercises. Doing Digital Humanities looks at the practicalities of how digital research and creation can enhance both learning and research, and offers an approachable way into this complex yet essential topic.
XML programming at its best! Discover a book that tells you what you should do and how. Instead of jumping right into the instructions, this book first provides all the necessary concepts you need to learn in order to make the learning process a whole lot easier. This way, you're sure not to get lost in confusion once you get to the more complex lessons provided in the later chapters. Graphs and flowcharts, as well as sample code, are provided for a more visual approach to your learning. You will also learn the designs and forms of XML - and what's more convenient than getting to know both sides! Want to know more?
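As a taste of the kind of material such an introduction covers, here is a minimal example of parsing an XML document with Python's standard library (the document and its contents are invented for illustration):

```python
import xml.etree.ElementTree as ET

doc = """<library>
  <book id="b1"><title>Learning XML</title><price>19.99</price></book>
  <book id="b2"><title>XML in Practice</title><price>24.50</price></book>
</library>"""

root = ET.fromstring(doc)                    # parse the string into a tree
titles = [b.find("title").text              # child-element text content
          for b in root.iter("book")]
ids = [b.get("id") for b in root.iter("book")]   # attribute access
total = sum(float(b.find("price").text)          # combine data from the tree
            for b in root.iter("book"))
```

The same tree can be navigated with XPath-style expressions (e.g. `root.findall("book/title")`), which most XML tutorials introduce shortly after basic element and attribute access.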
While a typical project manager's responsibility and accountability are both limited to a project with a clear start and end date, IT managers are responsible for an ongoing, ever-changing process, through which they must adapt and evolve to stay up to date, dependable, and secure in their field. Professional Advancements and Management Trends in the IT Sector offers the latest managerial trends within the field of information technology management. By collecting research from experts from around the world, in a variety of sectors and levels of technical expertise, this volume offers a broad variety of case studies, best practices, methodologies, and research within the field of information technology management. It will serve as a vital resource for practitioners and academics alike.
The papers in this volume comprise the refereed proceedings of the First International Conference on Computer and Computing Technologies in Agriculture (CCTA 2007), held in Wuyishan, China, in 2007. The conference was organized by China Agricultural University, the Chinese Society of Agricultural Engineering and the Beijing Society for Information Technology in Agriculture. Its purpose was to facilitate communication and cooperation between institutions and researchers on theories, methods and implementations of computer science and information technology. By researching information technology development and resources integration in rural areas in China, an innovative and effective approach is expected to be explored to promote the application of technology to the development of modern agriculture and to contribute to the construction of the new countryside. The rapid development of information technology has induced substantial changes and had an impact on the development of China's rural areas. Western thought has exerted a great impact on studies of Chinese information technology development, and it has helped more Chinese and Western scholars to expand their studies in this academic and application area. Thus this conference, with works by many prominent scholars, covered computer science and technology and information development in China's rural areas, and probed into all the important issues and the newest research topics, such as agricultural decision support systems and expert systems; GIS, GPS, RS and precision farming; CT applications in rural areas; agricultural system simulation; evolutionary computing; etc.
This book presents methodologies for analysing large data sets produced by the direct numerical simulation (DNS) of turbulence and combustion. It describes the development of models that can be used to analyse large eddy simulations, and highlights both the most common techniques and newly emerging ones. The chapters, written by internationally respected experts, invite readers to consider DNS of turbulence and combustion from a formal, data-driven standpoint, rather than one led by experience and intuition. This perspective allows readers to recognise the shortcomings of existing models, with the ultimate goal of quantifying and reducing model-based uncertainty. In addition, recent advances in machine learning and statistical inferences offer new insights on the interpretation of DNS data. The book will especially benefit graduate-level students and researchers in mechanical and aerospace engineering, e.g. those with an interest in general fluid mechanics, applied mathematics, and the environmental and atmospheric sciences.
Collaboration is a form of electronic communication in which individuals work on the same documents or processes over a period of time. When applied to technologies development, collaboration often has a focus on user-centered design and rapid prototyping, with a strong people-orientation. "Collaborative Technologies and Applications for Interactive Information Design: Emerging Trends in User Experiences" covers a wide range of emerging topics in collaboration, Web 2.0, and social computing, with a focus on technologies that impact the user experience. This cutting-edge source provides the latest international findings useful to practitioners, researchers, and academicians involved in education, ontologies, open source communities, and trusted networks.
In many countries, small businesses comprise over 95% of the proportion of private businesses and approximately half of the private workforce, with information technology being used in more than 90% of these businesses. As a result, governments worldwide are placing increasing importance upon the success of small business entrepreneurs and are providing increased resources to support this emphasis. Managing Information Technology in Small Business: Challenges and Solutions presents research in areas such as IT performance, electronic commerce, internet adoption, and IT planning methodologies and focuses on how these areas impact small businesses.
This volume examines the application of swarm intelligence in data mining, addressing the issues of swarm intelligence and data mining using novel intelligent approaches. The book comprises 11 chapters including an introduction reviewing fundamental definitions and important research challenges. Important features include a detailed overview of swarm intelligence and data mining paradigms, focused coverage of timely, advanced data mining topics, state-of-the-art theoretical research and application developments and contributions by pioneers in the field.
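Particle swarm optimisation (PSO) is one of the swarm-intelligence techniques typically covered in such a volume. A minimal, generic sketch (not code from the book) minimising a simple test function:

```python
import numpy as np

def pso_minimize(f, dim=2, n_particles=20, n_iter=100, seed=0,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimisation of f over a box.
    Each particle is pulled toward its own best position (c1 term) and
    the swarm's best position (c2 term), with inertia weight w."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros_like(x)                          # velocities
    pbest = x.copy()                              # per-particle best positions
    pbest_val = np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()      # swarm-wide best position
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        better = vals < pbest_val
        pbest[better] = x[better]
        pbest_val[better] = vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

# sphere function: global minimum 0 at the origin
best_x, best_val = pso_minimize(lambda p: float((p ** 2).sum()))
```

In data-mining applications of the kind the volume surveys, the objective f would instead score, for example, a candidate clustering or feature subset rather than a synthetic test function.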
The 20th anniversary of the IFIP WG 6.1 Joint International Conference on Formal Methods for Distributed Systems and Communication Protocols (FORTE XIII / PSTV XX) was celebrated by the year 2000 edition of the conference, which was held for the first time in Italy, at Pisa, October 10-13, 2000. In devising the subtitle for this special edition - 'Formal Methods Implementation Under Test' - we wanted to convey two main concepts that, in our opinion, are reflected in the contents of this book. First, the early, pioneering phases in the development of Formal Methods (FMs), with their conflicts between evangelistic and agnostic attitudes, with their over-optimistic applications to toy examples and over-skeptical views about scalability to industrial cases, with their misconceptions and myths... all this is essentially over. Many FMs have successfully reached their maturity, having been 'implemented' into concrete development practice: a number of papers in this book report successful experiences in specifying and verifying real distributed systems and protocols. Second, one of the several myths about FMs - the notion that their adoption would eventually eliminate the need for testing - is still quite far from becoming a reality, and, again, this book indicates that testing theory and applications are still remarkably healthy. A total of 63 papers were submitted to FORTE/PSTV 2000, out of which the Programme Committee selected 22 for presentation at the conference and inclusion in the proceedings.
The more complex instructional design (ID) projects grow, the more a design language can support their success, and the continuing integration of technologies into education makes this issue even more relevant. The Handbook of Visual Languages for Instructional Design: Theories and Practice serves as a practical guide for the integration of ID languages and notation systems into the practice of ID by presenting recent languages and notation systems for ID; exploring the connection between the use of ID languages and the integration of technologies in education; and assessing the benefits and drawbacks of the use of ID languages in specific project settings.
Silicon-On-Insulator (SOI) CMOS technology has been regarded as another major technology for VLSI in addition to bulk CMOS technology. Owing to the buried oxide structure, SOI technology offers superior CMOS devices with higher speed, high density, and reduced second order effects for deep-submicron low-voltage, low-power VLSI circuits applications. In addition to VLSI applications, and because of its outstanding properties, SOI technology has been used to realize communication circuits, microwave devices, BICMOS devices, and even fiber optics applications. CMOS VLSI Engineering: Silicon-On-Insulator addresses three key factors in engineering SOI CMOS VLSI - processing technology, device modelling, and circuit designs are all covered with their mutual interactions. Starting from the SOI CMOS processing technology and the SOI CMOS digital and analog circuits, behaviors of the SOI CMOS devices are presented, followed by a CAD program, ST-SPICE, which incorporates models for deep-submicron fully-depleted mesa-isolated SOI CMOS devices and special purpose SOI devices including polysilicon TFTs. CMOS VLSI Engineering: Silicon-On-Insulator is written for undergraduate senior students and first-year graduate students interested in CMOS VLSI. It will also be suitable for electrical engineering professionals interested in microelectronics.
As miniaturisation deepens, and nanotechnology and its machines become more prevalent in the real world, the need to consider using quantum mechanical concepts to perform various tasks in computation increases. Such tasks include: the teleporting of information, breaking heretofore "unbreakable" codes, communicating with messages that betray eavesdropping, and the generation of random numbers. This is the first book to apply quantum physics to the basic operations of a computer, making it an ideal vehicle for explaining the complexities of quantum mechanics to students, researchers and computer engineers alike as they prepare to design and create the computing and information delivery systems of the future. Both authors have solid backgrounds in the subject matter at the theoretical and the more practical level. While serving as a text for senior/graduate-level students in computer science, physics and engineering, this book has its primary use as an up-to-date reference work in the emerging interdisciplinary field of quantum computing - the only prerequisites being knowledge of calculus and familiarity with the concept of the Turing machine.
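One of the tasks mentioned, random-number generation, has a particularly compact quantum description: apply a Hadamard gate to a qubit in state |0>, then measure, obtaining 0 or 1 with equal probability. A classical simulation of that experiment (illustrative only; an actual quantum device performs the measurement physically):

```python
import numpy as np

H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard gate
ket0 = np.array([1.0, 0.0])                  # qubit prepared in state |0>
state = H @ ket0                             # superposition (|0> + |1>)/sqrt(2)
probs = np.abs(state) ** 2                   # Born rule: outcome probabilities

# simulate 1000 independent prepare-and-measure runs
rng = np.random.default_rng(42)
bits = rng.choice([0, 1], size=1000, p=probs)
```

On real hardware the bits are unpredictable in principle rather than pseudorandom, which is exactly what makes quantum random-number generation attractive.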
This volume contains the proceedings of IFIPTM 2010, the 4th IFIP WG 11.11 International Conference on Trust Management, held in Morioka, Iwate, Japan during June 16-18, 2010. IFIPTM 2010 provided a truly global platform for the reporting of research, development, policy, and practice in the interdependent areas of privacy, security, and trust. Building on the traditions inherited from the highly successful iTrust conference series, the IFIPTM 2007 conference in Moncton, New Brunswick, Canada, the IFIPTM 2008 conference in Trondheim, Norway, and the IFIPTM 2009 conference at Purdue University in Indiana, USA, IFIPTM 2010 focused on trust, privacy and security from multidisciplinary perspectives. The conference is an arena for discussion of relevant problems from both research and practice in the areas of academia, business, and government. IFIPTM 2010 was an open IFIP conference. The program of the conference featured both theoretical research papers and reports of real-world case studies. IFIPTM 2010 received 61 submissions from 25 different countries: Japan (10), UK (6), USA (6), Canada (5), Germany (5), China (3), Denmark (2), India (2), Italy (2), Luxembourg (2), The Netherlands (2), Switzerland (2), Taiwan (2), Austria, Estonia, Finland, France, Ireland, Israel, Korea, Malaysia, Norway, Singapore, Spain, and Turkey. The Program Committee selected 18 full papers for presentation and inclusion in the proceedings. In addition, the program and the proceedings include two invited papers by academic experts in the fields of trust management, privacy and security, namely Toshio Yamagishi and Pamela Briggs.