M-commerce (mobile-commerce) refers to e-commerce activities carried out via a mobile terminal such as a phone or PDA. M-commerce applications for both individuals and organizations are expected to grow considerably over the next few years. Mobile Commerce: Technology, Theory and Applications addresses issues pertaining to the development, deployment, and use of these applications. The objective of this book is to provide a single source of up-to-date information about mobile commerce including the technology (hardware and software) involved, research on the expected impact of this technology on businesses and consumers, and case studies describing state-of-the-art m-commerce applications and lessons learned.
Decades of research have shown that student collaboration in groups doesn't just happen; rather it needs to be a deliberate process facilitated by the instructor. Promoting collaboration in virtual learning environments presents a variety of challenges. Computer-Supported Collaborative Learning: Best Practices & Principles for Instructors answers the demand for a thorough resource on techniques to facilitate effective collaborative learning in virtual environments. This book provides must-have information on the role of the instructor in computer-supported collaborative learning, real-world perspectives on virtual learning group collaboration, and supporting learning group motivation.
Software has long been perceived as complex, at least within Software Engineering circles. We have been living in a recognised state of crisis since the first NATO Software Engineering conference in 1968. Time and again we have been proven unable to engineer reliable software as easily and cheaply as we imagined. Cost overruns and expensive failures are the norm.
Whether by synergy or by synthesis, development and technology are becoming synonymous in every domain. Cases on Transnational Learning and Technologically Enabled Environments reports on national and international initiatives undertaken to adapt advancements in information and communication technology and successfully face the challenges posed by various social and economic forces. The international research in this book represents instances of institutions that are in transition as well as those that are readily using technology in education.
This research volume presents a sample of recent contributions on quality assessment for Web-based information in the context of information access, retrieval, and filtering systems. The advent of the Web and the uncontrolled process of document generation have raised the problem of assessing the quality of information on the Web, taking into account the nature of documents (texts, images, video, sounds, and so on), the genre of documents (news, geographic information, ontologies, medical records, product records, and so on), the reputation of information sources and sites, and, last but not least, the actions performed on documents (content indexing, retrieval and ranking, collaborative filtering, and so on). The volume constitutes a compendium of heterogeneous approaches and sample applications focusing on specific aspects of quality assessment for Web-based information, aimed at researchers, PhD students, and practitioners working in Web information retrieval and filtering, Web information mining, and information quality representation and management.
This book deals with an information-driven approach to planning materials discovery and design through iterative learning. The authors present contrasting but complementary approaches, such as those based on high-throughput calculations, combinatorial experiments, or data-driven discovery, together with machine-learning methods. Statistical methods successfully applied in other fields, such as the biosciences, are also presented. The content spans from materials science to information science, reflecting the cross-disciplinary nature of the field. A perspective is presented that offers a paradigm (a codesign loop for materials design) involving iterative learning from experiments and calculations to develop materials with optimum properties. Such a loop requires incorporating domain materials knowledge, a database of descriptors (the "genes"), a surrogate or statistical model developed to predict a given property with uncertainties, adaptive experimental design to guide the next experiment or calculation, and aspects of high-throughput calculations as well as experiments. The book is about manufacturing, with the aim of halving the time to discover and design new materials. Accelerating discovery relies on using large databases, computation, and mathematics in the materials sciences in a manner similar to the way they were used in the Human Genome Initiative. Novel approaches are therefore called for to explore the enormous phase space presented by complex materials and processes. To achieve the desired performance gains, a predictive capability is needed to guide experiments and computations in the most fruitful directions by reducing unsuccessful trials. Despite advances in computational and experimental techniques, which generate vast arrays of data, the full value of data-driven discovery cannot be realized without a clear way of linking the data to models. Hence, along with experimental, theoretical, and computational materials science, we need to add a "fourth leg" to our toolkit to make the "Materials Genome" a reality: the science of materials informatics.
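The adaptive-design loop sketched in that description can be illustrated with a short, hypothetical example: a surrogate model with uncertainty estimates scores a pool of candidate descriptor vectors, and the most promising candidate is chosen as the next (here simulated) experiment. This is only a minimal sketch of the idea; the toy property function, the Gaussian-process surrogate, and all names are illustrative assumptions, not material from the book.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def run_experiment(x):
    # Stand-in for a real measurement or calculation of the target property.
    return -np.sum((x - 0.3) ** 2)

# Candidate pool: descriptor vectors (the "genes") for unexplored materials.
pool = rng.random((200, 3))

# Seed data from a few initial experiments.
X, pool = pool[:5], pool[5:]
y = np.array([run_experiment(x) for x in X])

for step in range(10):
    # Surrogate model predicting the property with uncertainty.
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    mean, std = gp.predict(pool, return_std=True)

    # Upper-confidence-bound acquisition: trade off predicted value vs. uncertainty.
    ucb = mean + 2.0 * std
    best = int(np.argmax(ucb))

    # "Run" the suggested experiment/calculation and add it to the dataset.
    x_next = pool[best]
    y_next = run_experiment(x_next)
    X = np.vstack([X, x_next])
    y = np.append(y, y_next)
    pool = np.delete(pool, best, axis=0)

print("Best property value found:", y.max())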
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field.
Increasing numbers of businesses and Information Technology firms are outsourcing their software and Web development tasks. It has been estimated that half of the Fortune 500 companies currently utilize outsourcing for their development needs, and that by the end of 2008, 40% of U.S. companies will develop, test, support, or store software overseas, with another 40% considering doing the same. Several industries, from computer software to telemarketing, have begun aggressively shifting white-collar work out of the United States. The United States currently accounts for more than half of worldwide spending on IT outsourcing, with a growing portion of this spending going to countries such as India, Russia, and the Philippines, and this trend will continue. Research has indicated that the primary problem is language, because of idiomatic expressions and subtle cultural nuances associated with the use of particular words. Thus, communication frequently breaks down when dealing with overseas companies.
In recent decades there has been incredible growth in the use of various internet applications by individuals and organizations who store sensitive information online on different servers. This greater reliance of organizations and individuals on internet technologies and applications increases the threat space and poses several challenges for implementing and maintaining cybersecurity practices. Constructing an Ethical Hacking Knowledge Base for Threat Awareness and Prevention provides innovative insights into how an ethical hacking knowledge base can be used for testing and improving the network and system security posture of an organization. It is critical for individuals and institutions to learn the hacking tools and techniques used by dangerous hackers, and to form teams of ethical hacking professionals to test their systems effectively. Highlighting topics including cyber operations, server security, and network statistics, this publication is designed for technical experts, students, academicians, government officials, and industry professionals.
This edited, multi-author book gathers selected, peer-reviewed contributions based on papers presented at the 23rd International Workshop on Quantum Systems in Chemistry, Physics, and Biology (QSCP-XXIII), held in Mopani Camp, The Kruger National Park, South Africa, in September 2018. The content is primarily intended for scholars, researchers, and graduate students working at universities and scientific institutes who are interested in the structure, properties, dynamics, and spectroscopy of atoms, molecules, biological systems, and condensed matter.
The biggest challenges faced by the software industry are cost control and schedule control. As such, effective strategies for process improvement must be researched and implemented. Analyzing the Role of Risk Mitigation and Monitoring in Software Development is a critical scholarly resource that explores software risk and development as organizations continue to implement more applications across multiple technologies and a multi-tiered environment. Featuring coverage on a broad range of topics such as quantitative risk assessment, threat analysis, and software vulnerability management, this book is a vital resource for engineers, academicians, professionals, and researchers seeking current research on the importance of risk management in software development.
Enabling information interoperability, fostering legal knowledge usability and reuse, enhancing legal information search, in short, formalizing the complexity of legal knowledge to enhance legal knowledge management are challenging tasks, for which different solutions and lines of research have been proposed. During the last decade, research and applications based on the use of legal ontologies as a technique to represent legal knowledge have raised a very interesting debate about their capacity and limitations to represent conceptual structures in the legal domain. Making conceptual legal knowledge explicit would support the development of a web of legal knowledge, improve communication, create trust, and enable and support open data, e-government, and e-democracy activities. Moreover, this explicit knowledge is also relevant to the formalization of software agents and the shaping of virtual institutions and multi-agent systems or environments. This book explores the use of ontologies in legal knowledge representation for semantically enhanced legal knowledge systems and web-based applications. Current methodologies, tools, and languages used for ontology development are reviewed, and the book includes an exhaustive review of existing ontologies in the legal domain. The development of the Ontology of Professional Judicial Knowledge (OPJK) is presented as a case study.
The aim of this book is to give a complete overview of classical electromagnetic theory together with detailed insight into modern numerical methods for the analysis of problems in electromagnetics. Classical electromagnetic theory was developed in the 19th century, but owing to a wide range of applications, from electrical apparatus such as motors and heaters to telecommunications, the subject remains highly relevant. This book explains the basic postulates and laws of the theory and its specialization to static and time-dependent problems. Special attention is given to the use of computers in applying modern numerical methods to the solution of electromagnetic field problems.
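As a hedged illustration of the kind of numerical method such a text covers, the following minimal sketch solves Laplace's equation for the electrostatic potential on a square region by finite-difference (Jacobi) relaxation; the grid size, boundary potentials, and tolerance are arbitrary demonstration choices, not values taken from the book.

import numpy as np

n = 50                      # grid points per side
phi = np.zeros((n, n))      # potential, initially zero everywhere
phi[0, :] = 1.0             # top edge held at 1 V, other edges at 0 V

for _ in range(10_000):
    old = phi.copy()
    # Jacobi update: each interior point becomes the average of its neighbours,
    # the discrete form of Laplace's equation (div grad phi = 0).
    phi[1:-1, 1:-1] = 0.25 * (old[:-2, 1:-1] + old[2:, 1:-1] +
                              old[1:-1, :-2] + old[1:-1, 2:])
    if np.max(np.abs(phi - old)) < 1e-6:
        break

print("Potential at the centre of the region:", phi[n // 2, n // 2])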
This book includes high-quality papers presented at the International Conference on Scientific and Natural Computing (SNC 2021), organized by the Department of Applied Mathematics, Gautam Buddha University, Greater Noida, in collaboration with IIT Roorkee and the Technical University of Ostrava (VSB-TU), technically sponsored by the Soft Computing Research Society of India, and held online during 5-6 February 2021. The topics include self-organizing migrating algorithms, genetic algorithms, swarm intelligence based techniques, evolutionary computing, fuzzy computing, probabilistic computing, genetic programming, particle swarm optimization, neuro computing, hybrid methods, deep learning (including convolutional neural networks, generative adversarial networks, and auto-encoders), bio-inspired systems, data mining, data visualization, intelligent agents, engineering design optimization, multi-objective optimization, fault diagnosis, decision support, robotics, signal or image processing, system identification and modelling, systems integration, time series prediction, virtual reality, vision or pattern recognition, intelligent information retrieval, motion control and power electronics, Internet of Everything (IoE), control systems, and supply chain management.
"From the Preface: "
Social networks have emerged as a major trend in computing and social paradigms in the past few years. The social network model helps to inform the study of community behavior, allowing qualitative and quantitative assessments of how people communicate and the rules that govern communication. Social Networking and Community Behavior Modeling: Qualitative and Quantitative Measures provides a clear and consolidated view of current social network models. This work explores new methods for modeling, characterizing, and constructing social networks. Chapters contained in this book study critical security issues confronting social networking, the emergence of new mobile social networking devices and applications, network robustness, and how social networks impact the business aspects of organizations.
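A small, hypothetical example of the kind of quantitative measure such models support is degree centrality on a toy "who communicates with whom" graph; the people and edges below are invented for illustration and are not drawn from the book.

from collections import defaultdict

edges = [("ana", "ben"), ("ana", "cai"), ("ben", "cai"),
         ("cai", "dee"), ("dee", "eli")]

# Build an undirected adjacency list from the communication edges.
neighbours = defaultdict(set)
for a, b in edges:
    neighbours[a].add(b)
    neighbours[b].add(a)

# Degree centrality: fraction of other members each person is connected to.
n = len(neighbours)
centrality = {person: len(links) / (n - 1) for person, links in neighbours.items()}

for person, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{person}: {score:.2f}")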
Tearing and interconnecting methods, such as FETI, FETI-DP, BETI, etc., are among the most successful domain decomposition solvers for partial differential equations. The purpose of this book is to give a detailed and self-contained presentation of these methods, including the corresponding algorithms as well as a rigorous convergence theory. In particular, two issues are addressed that have not been covered in any monograph yet: the coupling of finite and boundary elements within the tearing and interconnecting framework, including exterior problems, and the case of highly varying (multiscale) coefficients not resolved by the subdomain partitioning. In this context, the book offers a detailed view of an active and up-to-date area of research.
Since 1960, Advances in Computers has chronicled the constantly shifting theories and methods of Information Technology that greatly shape our lives today.
'Rough Computing' explores the application of rough set theory, which has attracted attention for its ability to enhance databases by allowing for the management of uncertainty, and offers a comparative analysis between rough sets and other intelligent data analysis techniques.
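For readers unfamiliar with the core construct, the following minimal sketch computes the lower and upper approximations of a target set under an indiscernibility relation, the basic mechanism by which rough sets represent uncertainty; the toy records and attribute values are assumptions made for illustration, not examples from the book.

from collections import defaultdict

records = {                      # object id -> attribute values
    1: ("high", "yes"), 2: ("high", "yes"), 3: ("high", "no"),
    4: ("low", "no"),   5: ("low", "no"),
}
target = {1, 2, 4}               # objects belonging to the concept of interest

# Equivalence classes: objects are indiscernible if their attributes match.
classes = defaultdict(set)
for obj, attrs in records.items():
    classes[attrs].add(obj)

# Lower approximation: classes wholly inside the target (certain members).
lower = set().union(*(c for c in classes.values() if c <= target))
# Upper approximation: classes that overlap the target (possible members).
upper = set().union(*(c for c in classes.values() if c & target))

print("lower:", sorted(lower), "upper:", sorted(upper))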
Technological advancements have become an integral part of life, impacting the way we work, communicate, make decisions, learn, and play. As technology continually progresses, humans are being outpaced by its capabilities, and it is important for businesses, organizations, and individuals to understand how to optimize data and to implement new methods for more efficient knowledge discovery and information management and retrieval. Innovative Applications of Knowledge Discovery and Information Resources Management offers in-depth coverage on the pervasiveness of technological change with a collection of material on topics such as the impact of permeable work-life boundaries, burnout and turnover, big data usage, and computer-based learning. It proves a worthy source for academicians, practitioners, IT leaders, IT professionals, and advanced-level students interested in examining the ways in which technology is changing the world.
You may like...
Credit Where Credit Is Due - Respecting… by Patricia Ann Mabrouk, Judith Currano (Hardcover, R4,068)
FDA and Intellectual Property Strategies… by Gerald B. Halt, John C. Donch, … (Hardcover, R4,054)
New Directions in Law and Literature by Elizabeth S. Anker, Bernadette Meyler (Hardcover, R3,335)