This book examines the implementation of emerging technology projects in the service-based Indian IT sector. The title shows how emerging technologies impact IT-enabled Services (ITeS) organizations and examines the mobility prospects for engineers and students looking to enter IT. Indian IT, dominated by organizations offering ITeS, provides services to clients across the world. Fueling this sector's growth are engineering graduates. Emerging technologies such as AI, Big Data, Cloud, and Blockchain have brought the IT and engineering education sectors to a crossroads, with global implications. The IT sector is facing growing demands for new technology solutions from its clients, and it is engineering students who are expected to upskill in order to build these solutions. The volume provides a rare, bottom-up look at the intersection of technology, education and organizational structure, based on an ethnographic study. This book will be a helpful and unique resource for managers in IT-enabled Services grappling with emerging technologies, for researchers looking at how emerging technologies impact organizations, and for those developing innovative IT courses in higher education. Readers interested in the global structure of IT education and industry will also find a fresh, ethnographically informed take on these issues.
This book is a valuable companion for everyone who is interested in the historical context of the co-evolution of financial markets and information technologies in the last 30 years. The contributors analyze system architectures and solution technologies in banking and finance by focusing on the particularities of certain practices and risks.
At the beginning we would like to introduce a refinement. The term 'VLSI planarization' means the planarization of a VLSI circuit, i.e. the embedding of a VLSI circuit in the plane according to different criteria such as the minimum number of connectors, the minimum total length of connectors, the minimum number of over-the-element routes, etc. A connector is designed to connect the broken sections of a net. It can be implemented in different ways depending on the technology. Connectors for a bipolar VLSI are implemented by diffused tunnels, for instance. By over-the-element route we shall mean a connection which intersects the enclosing rectangle of an element (or a cell). The possibility of constructing such connections during circuit planarization is reflected in element models and can be ensured, for example, by the availability of areas within the rectangles where connections may be routed. VLSI planarization is one of the basic stages (others will be discussed below) of the so-called topological (in the mathematical sense) approach to VLSI design. This approach does not lie in the direction of the classical approach to the automation of VLSI layout design. In the classical approach to computer-aided design the placement and routing problems are solved successively. The topological approach, in contrast, allows one to solve both problems at the same time. This is achieved by constructing a planar embedding of a circuit and obtaining the proper VLSI layout on the basis of it.
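To make the notion of a planar embedding concrete, here is a minimal sketch (not from the book) that models a hypothetical netlist as a graph and tests whether it admits a planar embedding using the networkx library; in the topological approach, such an embedding is the starting point from which placement and routing are derived together.

```python
# Minimal sketch: model a circuit netlist as a graph and test whether it
# admits a planar embedding. The netlist below is hypothetical.
import networkx as nx

# Nodes are circuit elements, edges are nets connecting pairs of elements.
netlist = [
    ("A", "B"), ("A", "C"), ("B", "C"),
    ("B", "D"), ("C", "D"), ("A", "D"),
]

G = nx.Graph(netlist)

is_planar, embedding = nx.check_planarity(G)
if is_planar:
    # A combinatorial (rotation-system) embedding: for each node, the
    # cyclic order of its incident edges in some planar drawing.
    print("Planar; clockwise edge order per node:")
    for node in G.nodes:
        print(node, list(embedding.neighbors_cw_order(node)))
else:
    # A non-planar circuit would need connectors (e.g. diffused tunnels)
    # or over-the-element routes to resolve the remaining crossings.
    print("Not planar as given; crossings must be resolved by connectors.")
```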
Modeling, Control And Optimization Of Complex Systems is a collection of contributions from leading international researchers in the fields of dynamic systems, control theory, and modeling. These papers were presented at the Symposium on Modeling and Optimization of Complex Systems in honor of Larry Yu-Chi Ho in June 2001. They include research topics such as the modeling of complex systems.
Enabling Technologies for Computational Science assesses future application computing needs, identifies research directions in problem-solving environments (PSEs), addresses multi-disciplinary environments operating on the Web, proposes methodologies and software architectures for building adaptive and human-centered PSEs, and describes the role of symbolic computing in scientific and engineering PSEs. The book also includes an extensive bibliography of over 400 references. Enabling Technologies for Computational Science illustrates the extremely broad and interdisciplinary nature of the creation and application of PSEs. Authors represent academia, government laboratories and industry, and come from eight distinct disciplines (chemical engineering, computer science, ecology, electrical engineering, mathematics, mechanical engineering, psychology and wood sciences). This breadth and diversity extends into the computer science aspects of PSEs. These papers deal with topics such as artificial intelligence, computer-human interaction, control, data mining, graphics, language design and implementation, networking, numerical analysis, performance evaluation, and symbolic computing. Enabling Technologies for Computational Science provides an assessment of the state of the art and a road map to the future in the area of problem-solving environments for scientific computing. This book is suitable as a reference for scientists from a variety of disciplines interested in using PSEs for their research.
Computer-Aided Design of User Interfaces VI gathers the latest experience of experts, research teams and leading organisations involved in the computer-aided design of user interactive applications. This area investigates how it is desirable and possible to support, facilitate and speed up the development life cycle of any interactive system: requirements engineering, early-stage design, detailed design, development, deployment, evaluation, and maintenance. In particular, it stresses how the design activity could be better understood for different types of advanced interactive, ubiquitous-computing, and multi-device environments.
Software Security: Concepts & Practices is designed as a textbook and explores the fundamental security theories that govern common software security technical issues. It focuses on practical programming materials that teach readers how to implement security solutions using the most popular software packages. It is not limited to any specific cybersecurity subtopic, and the chapters touch upon a wide range of cybersecurity domains, ranging from malware to biometrics and more. Features: The book presents the implementation of a unique socio-technical solution for real-time cybersecurity awareness. It provides comprehensible coverage of security, risk, protection, estimation, knowledge and governance. Various emerging standards, models, metrics, continuous updates and tools are described to help readers understand security principles and mitigation mechanisms for higher security. The book also explores common vulnerabilities plaguing today's web applications. The book is aimed primarily at advanced undergraduates and graduates studying computer science, artificial intelligence and information technology. Researchers and professionals will also find this book useful.
This book teaches the basics of Oracle GoldenGate, a product that is used to simplify the process of Oracle Database replication. GoldenGate can be used for reporting, failover, high availability, live reporting, data warehousing, and Big Data ETL processes, as well as for connecting to multiple other data sources outside of Oracle Database such as SQL Server, MySQL, Teradata, PostgreSQL, and many others. GoldenGate's purpose, and the reason for its popularity, is its ability to turn the highly complex architecture of database replication into a much simpler task. This book teaches the reader how to use Oracle GoldenGate, from installation to troubleshooting.
Discusses digital fashion design and e-prototyping, including 2D/3D CAD, fashion simulation, fit analysis, digital pattern cutting, marker making, and the zero-waste concept. Covers digital human modelling and VR/AR technology. Details digital fashion business and promotion, including the application of e-tools for the supply chain, e-commerce, blockchain technologies, big data, and AI.
This book analyzes the fundamental issues faced when blockchain technology is applied to real-life applications. These concerns, which are not confined to the realm of computer science, are caused by the nature of the technology's design. Blockchain is considered the foundation of a wide range of flexible ecosystems; its technology is an excellent mixture of mathematics, cryptography, incentive mechanisms, economics, and pertinent regulations. The book provides an essential understanding of why such fundamental issues arise, by revisiting the underlying theories. Blockchain theory is thus presented in an easy-to-understand, useful manner. Also explained is the reason why blockchain is hard to adopt for real-life problems but is valuable as a foundation for flexible ecosystems. Included are directions for solving those problems and finding suitable areas for blockchain applications in the future. The authors of this work are experts from a wide range of backgrounds such as cryptography, distributed computing, computer science, trust, identity, regulation, and standardization. Their contributions collected here will appeal to all who are interested in blockchain and the elements surrounding it.
This book collects and explains the many theorems concerning the existence of certificates of positivity for polynomials that are positive globally or on semialgebraic sets. A certificate of positivity for a real polynomial is an algebraic identity that gives an immediate proof of a positivity condition for the polynomial. Certificates of positivity have their roots in fundamental work of David Hilbert from the late 19th century on positive polynomials and sums of squares. Because of the numerous applications of certificates of positivity in mathematics, applied mathematics, engineering, and other fields, it is desirable to have methods for finding, describing, and characterizing them. For many of the topics covered in this book, appropriate algorithms, computational methods, and applications are discussed. This volume contains a comprehensive, accessible, up-to-date treatment of certificates of positivity, written by an expert in the field. It provides an overview of both the theory and computational aspects of the subject, and includes many of the recent and exciting developments in the area. Background information is given so that beginning graduate students and researchers who are not specialists can learn about this fascinating subject. Furthermore, researchers who work on certificates of positivity or use them in applications will find this a useful reference for their work.
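As a small illustration (mine, not drawn from the book), a sum-of-squares identity is a certificate of positivity: the identity itself immediately proves that the polynomial is positive on the whole real line.

```latex
% A sum-of-squares decomposition acting as a certificate of positivity:
% the algebraic identity proves p(x) > 0 for every real x at a glance.
\[
  p(x) \;=\; x^{2} - 4x + 5 \;=\; (x-2)^{2} + 1 \;\ge\; 1 \;>\; 0
  \qquad \text{for all } x \in \mathbb{R}.
\]
```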
This book deals with how to measure innovation in crisis management, drawing on data, case studies, and lessons learnt from different European countries. The aim of this book is to tackle innovation in crisis management through lessons learnt and experiences gained from the implementation of mixed methods through a practitioner-driven approach in a large-scale demonstration project (DRIVER+). It explores innovation from the perspective of the end-users by focusing on the needs and problems they are trying to address through a tool (be it an app, a drone, or a training program) and takes a deep dive into what is needed to understand if and to what extent the tool they have in mind can really bring innovation. This book is a toolkit for readers interested in understanding what needs to be in place to measure innovation: it provides the know-how through examples and best practices. The book will be a valuable source of knowledge for scientists, practitioners, researchers, and postgraduate students studying safety, crisis management, and innovation.
This monograph, for the first time in book form, considers the large structure of metric spaces as captured by bornologies: families of subsets that contain the singletons, that are stable under finite unions, and that are stable under taking subsets of their members. The largest bornology is the power set of the space and the smallest is the bornology of its finite subsets. Between these lie (among others) the metrically bounded subsets, the relatively compact subsets, the totally bounded subsets, and the Bourbaki bounded subsets. Classes of functions are intimately connected to various bornologies; e.g., (1) a function is locally Lipschitz if and only if its restriction to each relatively compact subset is Lipschitz; (2) a subset is Bourbaki bounded if and only if each uniformly continuous function on the space is bounded when restricted to the subset. A great deal of attention is given to the variational notions of strong uniform continuity and strong uniform convergence with respect to the members of a bornology, leading to the bornology of UC-subsets and UC-spaces. Spaces whose uniformly continuous real-valued functions are stable under pointwise product are characterized in terms of the coincidence of the Bourbaki bounded subsets with a usually larger bornology. Special attention is given to Lipschitz and locally Lipschitz functions. For example, uniformly dense subclasses of locally Lipschitz functions within the real-valued continuous functions, Cauchy continuous functions, and uniformly continuous functions are presented. It is shown very generally that a function between metric spaces has a particular metric property if and only if whenever it is followed in a composition by a real-valued Lipschitz function, the composition has the property. Bornological convergence of nets of closed subsets, having Attouch-Wets convergence as a prototype, is considered in detail. Topologies of uniform convergence for continuous linear operators between normed spaces are explained in terms of the bornological convergence of their graphs. Finally, the idea of a bornological extension of a topological space is presented, and all regular extensions can be so realized.
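Restating the defining properties from the description above in symbols, a bornology $\mathcal{B}$ on a space $X$ is a family of subsets satisfying the following three conditions.

```latex
% The bornology axioms, as listed in the description above.
\[
\begin{aligned}
&\text{(i)}   && \{x\} \in \mathcal{B} \ \text{ for every } x \in X
   && \text{(contains the singletons)}\\
&\text{(ii)}  && A, B \in \mathcal{B} \implies A \cup B \in \mathcal{B}
   && \text{(stable under finite unions)}\\
&\text{(iii)} && A \in \mathcal{B},\ C \subseteq A \implies C \in \mathcal{B}
   && \text{(stable under taking subsets of members)}
\end{aligned}
\]
```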
This is a comprehensive study of various time-dependent scheduling problems in single-, parallel- and dedicated-machine environments. In addition to complexity issues and exact or heuristic algorithms which are typically presented in scheduling books, the author also includes more advanced topics such as matrix methods in time-dependent scheduling, time-dependent scheduling with two criteria and time-dependent two-agent scheduling. The reader should be familiar with the basic notions of calculus, discrete mathematics and combinatorial optimization theory, while the book offers introductory material on the theory of algorithms, NP-complete problems, and the basics of scheduling theory. The author includes numerous examples, figures and tables; presents different classes of algorithms using pseudocode; completes all chapters with extensive bibliographies; and closes the book with comprehensive symbol and subject indexes. The previous edition of the book focused on the computational complexity of time-dependent scheduling problems. In this edition, the author concentrates on models of time-dependent job processing times and algorithms for solving time-dependent scheduling problems. The book is suitable for researchers working on scheduling, problem complexity, optimization, heuristics and local search algorithms.
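To illustrate what a model of time-dependent job processing times looks like (a standard example, not necessarily the book's own notation), the sketch below computes the makespan on a single machine for jobs whose processing times deteriorate linearly with their start times; the job data are hypothetical.

```python
# Minimal sketch of time-dependent scheduling: single machine, jobs with
# linearly deteriorating processing times p_j(t) = a_j + b_j * t, where t
# is the job's start time. Job data are hypothetical.
def makespan(schedule):
    """Completion time of the last job when jobs run in the given order."""
    t = 0.0
    for a, b in schedule:
        t += a + b * t  # processing time depends on the current start time t
    return t

jobs = [(3.0, 0.10), (5.0, 0.05), (2.0, 0.20)]  # (a_j, b_j) pairs

# Because processing times grow with time, the job order matters even on a
# single machine; compare two permutations of the same job set.
print(makespan(jobs))                              # original order
print(makespan(sorted(jobs, key=lambda j: j[0])))  # shortest basic time first
```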
David Busch's Sony Alpha a7 IV Guide to Digital Photography is the most comprehensive resource and reference for Sony's long-awaited 33-megapixel full frame mirrorless camera. Capable of 10 frame-per-second bursts even at full resolution, the a7 IV is fast enough for action photography, and its enhanced dynamic range delivers the image quality that the most demanding landscape or fine-art photographer requires. With this camera's remarkable low-light performance, fast sensor-based phase-detect autofocus (with real-time face and eye tracking in both still and movie modes), and improved 5-axis in-body image stabilization, the a7 IV has all the tools needed to take incredible images. This book will show you how to master those features as you explore the world of digital photography and hone your creativity with your a7 IV. Filled with detailed how-to steps and full-color illustrations, David Busch's Sony Alpha a7 IV Guide to Digital Photography covers every feature of this camera in depth, from taking your first photos through advanced details of setup, exposure, lens selection, lighting, and more, and relates each feature to specific photographic techniques and situations. Also included is the handy camera "roadmap" chapter, an easy-to-use visual guide to the camera's features and controls. Learn when to use every option and, more importantly, when not to use them, by following the author's recommended settings for each menu entry. With best-selling photographer and mentor David Busch as your guide, you'll quickly have full creative mastery of your camera's capabilities, whether you're shooting on the job, as an advanced enthusiast exploring full frame photography for the first time, or are just out for fun. Start building your knowledge and confidence, while bringing your vision to light with the Sony a7 IV.
This book is about describing the meaning of programming languages. The author teaches the skill of writing semantic descriptions as an efficient way to understand the features of a language. While a compiler or an interpreter offers a form of formal description of a language, it is not something that can be used as a basis for reasoning about that language nor can it serve as a definition of a programming language itself since this must allow a range of implementations. By writing a formal semantics of a language a designer can yield a far shorter description and tease out, analyse and record design choices. Early in the book the author introduces a simple notation, a meta-language, used to record descriptions of the semantics of languages. In a practical approach, he considers dozens of issues that arise in current programming languages and the key techniques that must be mastered in order to write the required formal semantic descriptions. The book concludes with a discussion of the eight key challenges: delimiting a language (concrete representation), delimiting the abstract content of a language, recording semantics (deterministic languages), operational semantics (non-determinism), context dependency, modelling sharing, modelling concurrency, and modelling exits. The content is class-tested and suitable for final-year undergraduate and postgraduate courses. It is also suitable for any designer who wants to understand languages at a deep level. Most chapters offer projects, some of these quite advanced exercises that ask for complete descriptions of languages, and the book is supported throughout with pointers to further reading and resources. As a prerequisite the reader should know at least one imperative high-level language and have some knowledge of discrete mathematics notation for logic and set theory.
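To give a flavour of what a semantic description pins down (this sketch is mine and uses plain Python rather than the author's meta-language), here is a big-step operational semantics for a tiny expression language, written as an evaluator over abstract syntax and an environment.

```python
# A big-step operational semantics for a tiny expression language with
# literals, variables and addition, written directly as an evaluator.
# The book's meta-language and notation differ; this only sketches the
# idea of mapping abstract syntax plus an environment to a value.
from dataclasses import dataclass

@dataclass
class Lit:            # n        -- integer literal
    value: int

@dataclass
class Var:            # x        -- variable lookup
    name: str

@dataclass
class Add:            # e1 + e2  -- addition
    left: object
    right: object

def evaluate(expr, env):
    """(expr, env) => value: the big-step evaluation relation."""
    if isinstance(expr, Lit):
        return expr.value
    if isinstance(expr, Var):
        return env[expr.name]   # context dependency: the environment
    if isinstance(expr, Add):
        return evaluate(expr.left, env) + evaluate(expr.right, env)
    raise TypeError(f"unknown expression: {expr!r}")

# (x + 3) + y in an environment where x = 1 and y = 4 evaluates to 8.
print(evaluate(Add(Add(Var("x"), Lit(3)), Var("y")), {"x": 1, "y": 4}))
```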
The information infrastructure - comprising computers, embedded devices, networks and software systems - is vital to operations in every sector: information technology, telecommunications, energy, banking and finance, transportation systems, chemicals, agriculture and food, defense industrial base, public health and health care, national monuments and icons, drinking water and water treatment systems, commercial facilities, dams, emergency services, commercial nuclear reactors, materials and waste, postal and shipping, and government facilities. Global business and industry, governments, indeed society itself, cannot function if major components of the critical information infrastructure are degraded, disabled or destroyed. This book, Critical Infrastructure Protection IV, is the fourth volume in the annual series produced by IFIP Working Group 11.10 on Critical Infrastructure Protection, an active international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts related to critical infrastructure protection. The book presents original research results and innovative applications in the area of infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. This volume contains seventeen edited papers from the Fourth Annual IFIP Working Group 11.10 International Conference on Critical Infrastructure Protection, held at the National Defense University, Washington, DC, March 15-17, 2010. The papers were refereed by members of IFIP Working Group 11.10 and other internationally recognized experts in critical infrastructure protection.
This book explores the possible creation and impact of electronic markets underpinned by government. How far could electronic trade go? The author outlines a world in which open online marketplaces are routinely used to trade everything from office space to bicycle rental between individuals. Each transaction would be guaranteed by the system, not the reputation of the seller. Anyone could enter the market as an equal. The author argues that the electronic marketplaces of the future will have widespread and fundamental economic and social consequences. For more information about Guaranteed Electronic Markets visit the Gems Website at www.gems.org.uk
This volume is the first extensive study of the historical and philosophical connections between technology and mathematics. Coverage includes the use of mathematics in ancient as well as modern technology, devices and machines for computation, cryptology, mathematics in technological education, the epistemology of computer-mediated proofs, and the relationship between technological and mathematical computability. The book also examines the work of such historical figures as Gottfried Wilhelm Leibniz, Charles Babbage, Ada Lovelace, and Alan Turing.
This reference blends the concepts of optics and microwave theory. It is logically organized in two main parts, the first section deals with network analysis, while the second concentrates on signal analysis. As a whole, the text focuses on the fundamental aspects of optical networks. Methodology, rather than analysis, is the focus of the book. The discussion provides the tools you need to perform your own in-depth analysis of optical networks.
This book is a result of ISD2000 - The Ninth International Conference on Information Systems Development: Methods and Tools, Theory and Practice, held August 14-16, in Kristiansand, Norway. The ISD conference has its roots in the first Polish-Scandinavian Seminar on Current Trends in Information Systems Development Methodologies, held in Gdansk, Poland in 1988. This year, as the conference carries this fine tradition into the new millennium, it was fitting that it returned to Scandinavia. Velkommen tilbake (welcome back)! Next year, ISD crosses the North Sea and, in the tradition of the Vikings, invades England. Like every ISD conference, ISD2000 gave participants an opportunity to express ideas on the current state of the art in information systems development, and to discuss and exchange views about new methods, tools and applications. This is particularly important now, since the field of ISD has seen rapid, and often bewildering, changes. To quote a Chinese proverb, we are indeed cursed, or blessed, depending on how we choose to look at it, to be "living in interesting times."
This newly revised edition of the highly successful 1997 book offers professionals and students an up-to-date, in-depth understanding of how payments are made electronically across the Internet. The second edition explores the very latest developments in this quickly expanding area, including the newest security techniques and methods, and features a completely new chapter on the exciting advances in mobile commerce.
Advances in Computers, Volume 112, the latest volume in a series published since 1960, presents detailed coverage of innovations in computer hardware, software, theory, design and applications. Chapters in this updated volume include Mobile Application Quality Assurance, Advances in Combinatorial Testing, Advances in Applications of Object Constraint Language for Software Engineering, Advances in Techniques for Test Prioritization, Data Warehouse Testing, Mutation Testing Advances: An Analysis and Survey, Event-Based Concurrency: Applications, Abstractions, and Analyses, and A Taxonomy of Software Integrity Protection Techniques.
Placing contemporary technological developments in their historical context, this book argues for the importance of law in their regulation. Technological developments are focused upon overcoming physical and human constraints. There are no normative constraints inherent in the quest for ongoing and future technological development. In contrast, law proffers an essential normative constraint. Just because we can do something does not mean that we should. Through the application of critical legal theory and jurisprudence to pro-actively engage with technology, this book demonstrates why legal thinking should be prioritised in emerging technological futures. The book articulates classic skills and values such as ethics and justice to ensure that future and ongoing legal engagements with socio-technological developments are tempered by legal normative constraints. It addresses law students and teachers, lawyers and critical thinkers concerned with the proliferation of technology in our lives, encouraging them to foreground questions of justice and critique when thinking about law and technology.
You may like...
Discovering Computers 2018 - Digital… by Misty Vermaat, Steven Freund, … (Paperback)
Natural Language Processing (NLP) and… by Florentina Hristea, Cornelia Caragea (Hardcover)
Dynamic Web Application Development… by David Parsons, Simon Stobart (Paperback)
Discovering Computers, Essentials… by Susan Sebok, Jennifer Campbell, … (Paperback)