From the Foreword: "...the presentation of real-time scheduling is probably the best in terms of clarity I have ever read in the professional literature. Easy to understand, which is important for busy professionals keen to acquire (or refresh) new knowledge without being bogged down in a convoluted narrative and an excessive detail overload. The authors managed to largely avoid theoretical-only presentation of the subject, which frequently affects books on operating systems. ... an indispensable [resource] to gain a thorough understanding of the real-time systems from the operating systems perspective, and to stay up to date with the recent trends and actual developments of the open-source real-time operating systems." -Richard Zurawski, ISA Group, San Francisco, California, USA Real-time embedded systems are integral to the global technological and social space, but references still rarely offer professionals the sufficient mix of theory and practical examples required to meet intensive economic, safety, and other demands on system development. Similarly, instructors have lacked a resource to help students fully understand the field. The information was out there, though often at the abstract level, fragmented and scattered throughout literature from different engineering disciplines and computing sciences. Accounting for readers' varying practical needs and experience levels, Real Time Embedded Systems: Open-Source Operating Systems Perspective offers a holistic overview from the operating-systems perspective. It provides a long-awaited reference on real-time operating systems and their almost boundless application potential in the embedded system domain. Balancing the already abundant coverage of operating systems with the largely ignored real-time aspects, or "physicality," the authors analyze several realistic case studies to introduce vital theoretical material. 
They also discuss popular open-source operating systems, Linux and FreeRTOS in particular, to help embedded-system designers weigh their benefits and weaknesses when deciding whether to adopt them or more traditional, less powerful techniques for a project.
Technology has impacted how many teachers develop methods of instruction in their classroom settings. The Continuous Practice Improvement (CPI) professional development program introduces teachers to infusing computers seamlessly into the curriculum and classroom activities. "Infusing Technology into the Classroom: Continuous Practice Improvement" retells compelling stories of a successful computer-related professional development program that was implemented in Kindergarten through eighth grade classrooms of a Philadelphia school. Through an analysis of the study, a theoretical model to guide technology-infused professional development for teachers emerges.
In information technology, unlike many other fields, the need to support the unique perspective of technologically advanced students and deliver technology-rich content presents distinct challenges. Today's IT students need the ability to interact with their instructor in near-real time, interact with their peers and project team members, and access and manipulate technology tools in the pursuit of their educational objectives. "Handbook of Distance Learning for Real-Time and Asynchronous Information Technology Education" delves deep into the construct of real-time, asynchronous education through information technology, pooling experiences from seasoned researchers and educators to detail their past successes and failures, discussing their techniques, hardships, and triumphs in the search for innovative and effective distance learning education for IT programs. This Premier Reference Source answers the increasing demand for a fundamental, decisive source on this cutting-edge issue facing all institutions, covering topics such as asynchronous communication, real-time instruction, multimedia content, content delivery, and distance education technologies.
When researchers in computer-mediated communications discuss digital textuality, they rarely venture beyond the now commonplace notion that computer textuality embodies contemporary post-structuralist theories. Written for students and faculty of contemporary literature and composition theories, this book is the first to move from general to specific considerations. Advancing from general considerations of how computers are changing literacy, Digital Fictions moves on to a specific consideration of how computers are altering one particular set of literature practices: reading and writing fiction. Suffused with the sensibility of a creative writer, this book includes an historical overview of writing stories on computers. In addition, Sloane conducts interviews with the makers of hypertext fictions (including Stuart Moulthrop, Michael Joyce, and Carolyn Guyer) and offers close readings of digital fictions. Making careful analyses of the meaning-making activities of both readers and writers of this emerging genre, this work is embedded in a perspective both feminist and semiotic. Digital Fictions explores and distinguishes among four distinct iterations of text-based digital fictions: text adventures, the Carnegie Mellon University Oz Project, hypertext fictions, and MUDs. Ultimately, Sloane revises the rhetorical triangle and proposes a new rhetorical theory, one that attends to the materials, processes, and locations of stories told on-line.
The book provides suggestions on how to start using bionic optimization methods, including pseudo-code examples of each of the important approaches and outlines of how to improve them. The most efficient methods for accelerating the studies are discussed. These include the selection of size and generations of a study's parameters, modification of these driving parameters, switching to gradient methods when approaching local maxima, and the use of parallel working hardware. Bionic Optimization means finding the best solution to a problem using methods found in nature. As Evolutionary Strategies and Particle Swarm Optimization seem to be the most important methods for structural optimization, we primarily focus on them. Other methods such as neural nets or ant colonies are more suited to control or process studies, so their basic ideas are outlined in order to motivate readers to start using them. A set of sample applications shows how Bionic Optimization works in practice. From academic studies on simple frames made of rods to earthquake-resistant buildings, readers follow the lessons learned, difficulties encountered and effective strategies for overcoming them. For the problem of tuned mass dampers, which play an important role in dynamic control, changing the goal and restrictions paves the way for Multi-Objective-Optimization. As most structural designers today use commercial software such as FE-Codes or CAE systems with integrated simulation modules, ways of integrating Bionic Optimization into these software packages are outlined and examples of typical systems and typical optimization approaches are presented. The closing section focuses on an overview and outlook on reliable and robust as well as on Multi-Objective-Optimization, including discussions of current and upcoming research topics in the field concerning a unified theory for handling stochastic design processes.
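The book's emphasis on pseudo-code for the important approaches can be illustrated with a minimal Particle Swarm Optimization sketch, one of the two methods the blurb highlights. This code is not from the book; the function name `pso`, the coefficient values, and the test function are all illustrative assumptions:

```python
import random

def pso(f, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0)):
    """Minimize f over a box using a basic particle swarm (illustrative sketch)."""
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia weight and acceleration coefficients
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best-known position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm's best-known position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity blends inertia, pull toward personal best, and pull toward swarm best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))  # clamp to bounds
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Example: minimize the sphere function, whose optimum is at the origin.
random.seed(0)
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```

Switching to a gradient method near a local optimum, as the book suggests, would replace the final iterations with a local search started from `gbest`.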
Ontologies and formal representations of knowledge are extremely powerful tools for modeling and managing large applications in several domains ranging from knowledge engineering, to data mining, to the semantic web. Ontology Theory, Management and Design: Advanced Tools and Models explores the wide range of applications for ontologies, while providing a complete view of both the theory behind the design and the problems posed by the practical development and use of ontologies. This reference presents an in-depth and forward-looking analysis of current research, illustrating the importance of this field and pointing toward the future of knowledge engineering, management and information technology.
Nature has long provided the inspiration for a variety of scientific discoveries in engineering, biomedicine, and computing, though only recently have these elements of nature been used directly in computational systems. Natural Computing for Simulation and Knowledge Discovery investigates the latest developments in nature-influenced technologies. Within its pages, readers will find an in-depth analysis of such advances as cryptographic solutions based on cell division, the creation and manipulation of biological computers, and particle swarm optimisation techniques. Scientists, practitioners, and students in fields such as computing, mathematics, and molecular science will make use of this essential reference to explore current trends in natural computation and advance nature-inspired technologies to the next generation.
This book explains the development of theoretical computer science in its early stages, specifically from 1965 to 1990. The author is among the pioneers of theoretical computer science, and he guides the reader through the early stages of development of this new discipline. He explains the origins of the field, arising from disciplines such as logic, mathematics, and electronics, and he describes the evolution of the key principles of computing in strands such as computability, algorithms, and programming. But mainly it's a story about people: pioneers with diverse backgrounds and characters who came together to overcome philosophical and institutional challenges and build a community. They collaborated on research efforts, they established schools and conferences, they developed the first related university courses, they taught generations of future researchers and practitioners, and they set up the key publications to communicate and archive their knowledge. The book is a fascinating insight into the field as it existed and evolved, and it will be valuable reading for anyone interested in the history of computing.
The concepts of innovation management and the learning organization strongly emphasize the central role of human/intellectual capital in the company and the crucial function of knowledge in modern society. However, there is often a paradox between managerial language and actual practice in many organizations: on one hand, knowledge-workers are perceived as the most valued members of organizations while, on the other, they are being manipulated and "engineered"-commonly driven to burn-out, and deprived of family life. All this leads to the emergence of new organizational phenomena that, up to now, have been insufficiently analyzed and described. Management Practices in High-Tech Environments studies this issue thoroughly from an international, comparative, cross-cultural perspective, presenting cutting-edge research on management practices in American, European, Asian and Middle-Eastern high-tech companies, with particular focus on fieldwork-driven, but reflective, contributions.
This book presents the latest findings and ongoing research in connection with green information systems and green information & communication technology (ICT). It provides valuable insights into a broad range of cross-cutting concerns in ICT and the environmental sciences, and showcases how ICT can be used to effectively address environmental and energy efficiency issues. Offering a selection of extended contributions to the 31st International Conference EnviroInfo 2017, it is essential reading for anyone looking to expand their expertise in the area.
Handbook of Research on Ambient Intelligence and Smart Environments: Trends and Perspectives covers the cutting-edge aspects of AmI applications, specifically those involving the effective design, realization, and implementation of a comprehensive AmI application. This pertinent publication targets researchers and practitioners in Ambient Intelligence, as well as those in ubiquitous and pervasive computing, artificial intelligence, sensor networks, knowledge representation, automated reasoning and learning, system and software engineering, and man-machine interfaces.
In today's technology-crazed environment, distance learning is touted as a cost-effective option for delivering employee training and higher education programs, such as bachelor's, master's and even doctoral degrees. Distance Learning Technologies: Issues, Trends and Opportunities provides readers with an in-depth understanding of distance learning and the technologies available for this innovative medium of learning and instruction. It traces the development of distance learning from its history and includes suggestions for a solid strategic implementation plan to ensure its successful and effective deployment.
The world of corporate management benefits when organizations realize the profitability, reliability, and flexibility obtained through IT standardization. Toward Corporate IT Standardization Management: Frameworks and Solutions details the IT standards conceptual model through insightful case studies that illustrate the factors affecting the performance of business processes. By offering organizations the opportunity to enhance process performance through IT standardization, this reference work demonstrates the effectiveness of IT standards, and the applicable techniques for implementation and management of such practices. This book features information useful to educators and students in the fields of Information Systems, IT-Management, Business Studies, and Economics, as well as IT practitioners and IS Managers.
This book examines construction safety from the perspective of informatics and econometrics. It demonstrates the potential of employing various information technology approaches to share construction safety knowledge. In addition, it presents the application of econometrics in construction safety studies, such as an analytic hierarchy process used to create a construction safety index. It also discusses structural equation and dynamic panel models for the analysis of construction safety claims. Lastly, it describes the use of mathematical and econometric models to investigate construction practitioners' safety.
This book contains the full papers presented at ICCEBS 2013 - the 1st International Conference on Computational and Experimental Biomedical Sciences, which was organized in Azores, in October 2013. The included papers present and discuss new trends in those fields, using several methods and techniques, including active shape models, constitutive models, isogeometric elements, genetic algorithms, level sets, material models, neural networks, optimization and the finite element method, in order to address more efficiently different and timely applications involving biofluids, computer simulation, computational biomechanics, image based diagnosis, image processing and analysis, image segmentation, image registration, scaffolds, simulation and surgical planning. The main audience for this book consists of researchers, Ph.D students and graduate students with multidisciplinary interests related to the areas of artificial intelligence, bioengineering, biology, biomechanics, computational fluid dynamics, computational mechanics, computational vision, histology, human motion, imagiology, applied mathematics, medical image, medicine, orthopaedics, rehabilitation, speech production and tissue engineering.
MACSYMA for Statisticians introduces the basic principles and ideas of MACSYMA so that it can be used for mathematical computations, manipulations and simplifications. MACSYMA is a large computer programming system which is designed to perform a wide spectrum of mathematical computations and manipulations in symbolic as well as numerical form. It operates interactively and displays results in ordinary mathematical notation. MACSYMA is perhaps most well known for its capabilities in algebraic manipulations and simplifications. However, its usefulness extends much further than that. In the field of analysis, MACSYMA performs differentiation, integration and the taking of limits. It can compute definite integrals, change variables, perform integration by parts, reduce rational polynomials into partial fractions and take Laplace transforms. MACSYMA has several commands relating to Taylor Series and asymptotic expansions.
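MACSYMA itself is a historical system (its open-source descendant is Maxima), but the classes of operations the blurb lists can be sketched with Python's sympy library as a rough modern analogue. The use of sympy rather than MACSYMA is this sketch's assumption; every operation shown corresponds to a capability named in the description:

```python
from sympy import (symbols, diff, integrate, limit, apart,
                   laplace_transform, series, sin, exp, oo)

x, t, s = symbols('x t s', positive=True)

deriv = diff(x**3 + sin(x), x)            # differentiation
integ = integrate(exp(-x), (x, 0, oo))    # definite integral over [0, oo)
lim = limit(sin(x) / x, x, 0)             # taking of limits
pf = apart(1 / (x**2 + 3*x + 2))          # rational polynomial -> partial fractions
lap = laplace_transform(exp(-t), t, s, noconds=True)  # Laplace transform
tay = series(exp(x), x, 0, 4)             # Taylor series expansion to order 4
```

As with MACSYMA, each call returns a symbolic expression in ordinary mathematical notation rather than a numerical approximation.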
Ecological Assessment of Polymers: Strategies for Product Stewardship and Regulatory Programs, by John D. Hamilton and Roger Sutcliffe. The expense of providing ecological assessments of new commercial products is formidable. The cost of the failure to comply with the current regulations--measured in fines, liability damages, and loss of public trust--is potentially much, much higher. Establishing effective environmental product stewardship strategies for assessment upfront not only promotes initial and continued compliance, it can reduce costs via the more efficient development of new products. Based on the collaboration of the Rohm and Haas Company and S.C. Johnson Wax with other manufacturers, contract laboratories, universities, and government agencies, Ecological Assessment of Polymers is the first complete reference to provide environment-oriented information about polymers from a product development and regulatory compliance perspective. A number of books deal with the potential hazards of pesticides and solvents. This is the first to focus on the commercial synthetic polymers that exist in laundry detergents, paints, super-absorbent diapers, packaging materials, and many other consumer and industrial products. Using the principles of environmental toxicology and chemistry, Ecological Assessment of Polymers approaches environmental evaluation as a decision-making process. The book demonstrates how assessment can be used as a planning tool for developing products, reducing potential liability, and creating new products, processes, and disposal systems. Featured discussions:
The CMOS Cookbook contains all you need to know to understand and successfully use CMOS (Complementary Metal-Oxide Semiconductor) integrated circuits. Written in a "cookbook" format that requires little math, this practical, user-oriented book covers all the basics for working with digital logic and many of its end applications.
The evolution of soft computing applications has offered a multitude of methodologies and techniques that are useful in facilitating new ways to address practical and real scenarios in a variety of fields. In particular, these concepts have created significant developments in the engineering field. Soft Computing Techniques and Applications in Mechanical Engineering is a pivotal reference source for the latest research findings on a comprehensive range of soft computing techniques applied in various fields of mechanical engineering. Featuring extensive coverage on relevant areas such as thermodynamics, fuzzy computing, and computational intelligence, this publication is an ideal resource for students, engineers, research scientists, and academicians involved in soft computing techniques and applications in mechanical engineering areas.
This book examines a writing activity that has recently fallen into disrepute. Outlining has a bad reputation among students, even though many teachers and textbooks still recommend the process. In part, the author argues, the medium is to blame. Paper and ink make revision difficult. But if one uses an electronic outliner, the activity can be very helpful in developing a thoughtful and effective document, particularly one that spans many pages and deals with a complicated subject. Outlining Goes Electronic takes an historical approach, examining the way people developed the idea of outlining, from the classical period to the present. We see that the medium in which people worked strongly shaped their assumptions, ideas, and use of outlines. In developing a theoretical model of outlining as an activity, the author argues that a relatively new electronic tool-software that accelerates and performs the process of outlining-can give us a new perspective from which to engage previous classroom models of writing, recent writing theory, and current practice in the technical writing field.
This is a volume of chapters on the historical study of information, computing, and society written by seven of the most senior, distinguished members of the History of Computing field. These are edited, expanded versions of papers presented in a distinguished lecture series in 2018 at the University of Colorado Boulder - in the shadow of the Flatirons, the front range of the Rocky Mountains. Topics range widely across the history of computing. They include the digitalization of computer and communication technologies, gender history of computing, the history of data science, incentives for innovation in the computing field, labor history of computing, and the process of standardization. Authors were given wide latitude to write on a topic of their own choice, so long as the result is an exemplary article that represents the highest level of scholarship in the field, producing articles that scholars in the field will still look to read twenty years from now. The intention is to publish articles of general interest, well situated in the research literature, well grounded in source material, and well-polished pieces of writing. The volume is primarily of interest to historians of computing, but individual articles will be of interest to scholars in media studies, communication, computer science, cognitive science, general and technology history, and business.
This textbook is intended as a comprehensive and substantially self-contained two-volume book covering performance, reliability, and availability evaluation subjects. The volumes focus on computing systems, although the methods may also be applied to other systems. The first volume covers Chapter 1 to Chapter 14, whose subtitle is "Performance Modeling and Background". The second volume encompasses Chapter 15 to Chapter 25 and has the subtitle "Reliability and Availability Modeling, Measuring and Workload, and Lifetime Data Analysis". This text is helpful for computer performance professionals in planning, designing, configuring, and tuning the performance, reliability, and availability of computing systems. Such professionals may use these volumes to get acquainted with specific subjects by looking at the particular chapters. Many examples in the textbook on computing systems will help them understand the concepts covered in each chapter. The text may also be helpful for the instructor who teaches performance, reliability, and availability evaluation subjects. Many possible threads could be configured according to the interest of the audience and the duration of the course. Chapter 1 presents a good number of possible course programs that could be organized using this text.
Standardization has the potential to shape, expand, and create markets. Information technology has undergone a rapid transformation in the application of standards in practice, and recent developments have augmented the need for the divulgence of supplementary research. Standardization Research in Information Technology: New Perspectives amasses cutting-edge research on the application of standards in the market, covering topics such as corporate standardization, linguistic qualities of international standards, the role of individuals in standardization, and the development, use, application, and influence of information technology in standardization techniques. |