Welcome to Loot.co.za!
This book offers a self-study program on how mathematics, computer science and science can be profitably and seamlessly intertwined. It focuses on two-variable ODE models, both linear and nonlinear, and highlights theoretical and computational tools, using MATLAB to explain their solutions. It also shows how to solve cable models using separation of variables and the Fourier series.
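The two-variable ODE models this blurb describes can be sketched with a few lines of code. Below is a minimal, illustrative Python example (the book itself uses MATLAB) that integrates a simple two-variable linear system, the harmonic oscillator x' = v, v' = -x, with a classical Runge-Kutta step and compares the result with the known analytic solution; all names here are illustrative, not taken from the book.

```python
import math

def rk4_step(f, y, t, dt):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt/2, [yi + dt/2*ki for yi, ki in zip(y, k1)])
    k3 = f(t + dt/2, [yi + dt/2*ki for yi, ki in zip(y, k2)])
    k4 = f(t + dt,   [yi + dt*ki   for yi, ki in zip(y, k3)])
    return [yi + dt/6*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Two-variable linear model: x' = v, v' = -x (harmonic oscillator).
def f(t, y):
    x, v = y
    return [v, -x]

y, t, dt = [1.0, 0.0], 0.0, 0.001
for _ in range(1000):            # integrate from t = 0 to t = 1
    y = rk4_step(f, y, t, dt)
    t += dt

print(y[0], math.cos(1.0))       # numerical vs analytic x(1) = cos(1)
```

With x(0) = 1 and v(0) = 0 the exact solution is x(t) = cos(t), so the printed values agree to many decimal places, which is the kind of theory-versus-computation comparison the blurb alludes to.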
FORTRAN Programming Success in a Day: Beginner's Guide to Fast, Easy and Efficient Learning of FORTRAN Programming. What is Fortran? How can you become proficient in Fortran programming? This is the perfect starter book for anyone trying to learn this specific type of programming. Want to learn about data types quickly? Need examples of data types? How about variables, or how to manipulate variables in Fortran? Every type of intrinsic function in Fortran is covered here. Finally, the book dives into conditional statements, put in terms that anyone with no background in programming can understand.
This book investigates the coordinated power management of multi-tenant data centers, which account for a large portion of the data center industry. The authors discuss their rapid growth and their electricity consumption, which has huge economic and environmental impacts. The book covers the coordinated management solutions in the existing literature, focusing on efficiency, sustainability, and demand response aspects. First, the authors provide background on the multi-tenant data center, covering the stakeholders, components, power infrastructure, and energy usage. Then, each power management mechanism is described in terms of motivation, problem formulation, challenges, and solutions.
System-on-chip designs have evolved from fairly simple unicore, single-memory designs to complex heterogeneous multicore SoC architectures consisting of a large number of IP blocks on the same silicon. To meet the high computational demands posed by the latest consumer electronic devices, most current systems are based on this paradigm, which represents a real revolution in many aspects of computing. The attraction of multicore processing for power reduction is compelling. By splitting a set of tasks among multiple processor cores, the operating frequency necessary for each core can be reduced, allowing the voltage on each core to be reduced as well. Because dynamic power is proportional to the frequency and to the square of the voltage, this yields a large gain, even though more cores may be running. As more and more cores are integrated into these designs to share the ever-increasing processing load, the main challenges lie in an efficient memory hierarchy, a scalable system interconnect, new programming paradigms, and an efficient integration methodology for connecting such heterogeneous cores into a single system capable of leveraging their individual flexibility. Current design methods tend toward mixed HW/SW co-designs targeting multicore systems-on-chip for specific applications. To decide on the lowest-cost mix of cores, designers must iteratively map the device's functionality to a particular HW/SW partition and target architectures. In addition, to connect the heterogeneous cores, the architecture requires high-performance, complex communication architectures and efficient communication protocols, such as a hierarchical bus, point-to-point connections, or a Network-on-Chip. Software development also becomes far more complex due to the difficulty of breaking a single processing task into multiple parts that can be processed separately and then reassembled later.
This reflects the fact that certain processor jobs cannot be easily parallelized to run concurrently on multiple processing cores and that load balancing between processing cores - especially heterogeneous cores - is very difficult.
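The power argument in this blurb can be made concrete with back-of-the-envelope arithmetic: dynamic CMOS power scales as P ∝ C·f·V², so spreading work across more cores at lower frequency and voltage can reduce total power even though more cores are active. The sketch below illustrates this; the specific numbers (2 GHz at 1.2 V vs four cores at 500 MHz and 0.8 V) are illustrative assumptions, not figures from the book.

```python
def dynamic_power(c_eff, freq, volt):
    """Dynamic CMOS power model: P = C_eff * f * V^2 (proportionality made explicit)."""
    return c_eff * freq * volt**2

C = 1.0                                   # effective switched capacitance (arbitrary units)
single = dynamic_power(C, 2.0e9, 1.2)     # one core at 2 GHz, 1.2 V

# The same aggregate throughput from 4 cores at a quarter of the frequency,
# which (illustratively) allows dropping the supply voltage to 0.8 V.
quad = 4 * dynamic_power(C, 0.5e9, 0.8)

print(single / quad)                      # the quad-core configuration uses 2.25x less power
```

Even though four cores are running instead of one, the V² term dominates: halving-ish the voltage more than compensates for the extra cores, which is exactly the "big gain" the text describes.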
As a field, computer science occupies a unique scientific space, in that its subject matter can exist in both physical and abstract realms. An artifact such as software is both tangible and not, and must be classified as something in between, or "liminal." The study and production of liminal artifacts allows for creative possibilities that are, and have been, possible only in computer science. In It Began With Babbage, Subrata Dasgupta examines the unique history of computer science in terms of its creative innovations, spanning back to Charles Babbage in 1819. Since all artifacts of computer science are conceived with a use in mind, the computer scientist is not concerned with the natural laws that govern disciplines like physics or chemistry; the computer scientist is more concerned with the concept of purpose. This requirement lends itself to a type of creative thinking that, as Dasgupta shows us, has exhibited itself throughout the history of computer science. From Babbage's Difference Engine, through the Second World War, to the establishment of the term "Computer Science" in 1956, It Began With Babbage traces a lively and complete history of computer science.
This book contains an edited selection of the papers accepted for presentation and discussion at the first International Symposium on Qualitative Research (ISQR2016), held in Porto, Portugal, July 12th-14th, 2016. The book and the symposium feature four main application fields (Education, Health, Social Sciences, and Engineering and Technology) and seven main subjects: Rationale and Paradigms of Qualitative Research (theoretical studies, critical reflection about epistemological, ontological and axiological dimensions); Systematization of Approaches with Qualitative Studies (literature review, integrating results, aggregation studies, meta-analysis, meta-analysis of qualitative meta-synthesis, meta-ethnography); Qualitative and Mixed Methods Research (emphasis on research processes that build on mixed methodologies but give priority to qualitative approaches); Data Analysis Types (content analysis, discourse analysis, thematic analysis, narrative analysis, etc.); Innovative Processes of Qualitative Data Analysis (design analysis, articulation and triangulation of different sources of data: images, audio, video); Qualitative Research in Web Context (eResearch, virtual ethnography, interaction analysis, latent corpus on the internet, etc.); Qualitative Analysis with Support of Specific Software (usability studies, user experience, the impact of software on the quality of research).
This book constitutes the refereed post-conference proceedings of the 10th IFIP WG 5.14 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2016, held in Dongying, China, in October 2016. The 55 revised papers presented were carefully reviewed and selected from 128 submissions. They cover a wide range of interesting theories and applications of information technology in agriculture, including intelligent sensing, cloud computing, key technologies of the Internet of Things, precision agriculture; animal husbandry information technology, including Internet + modern animal husbandry, livestock big data platforms and cloud computing applications, intelligent breeding equipment, precision production models; and water product networking and big data, including fishery IoT, intelligent aquaculture facilities, and big data applications.
Decades of research have shown that student collaboration in groups doesn't just happen; rather it needs to be a deliberate process facilitated by the instructor. Promoting collaboration in virtual learning environments presents a variety of challenges. Computer-Supported Collaborative Learning: Best Practices & Principles for Instructors answers the demand for a thorough resource on techniques to facilitate effective collaborative learning in virtual environments. This book provides must-have information on the role of the instructor in computer-supported collaborative learning, real-world perspectives on virtual learning group collaboration, and supporting learning group motivation.
A collection of papers from ISCIS 27th Annual Symposium.
This edited book presents scientific results of the 15th IEEE/ACIS International Conference on Computer and Information Science (ICIS 2016), which was held on June 26-29 in Okayama, Japan. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences and exchange new ideas and information in a meaningful way, to present research results on all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the best papers from those accepted for presentation at the conference. The papers were chosen based on review scores submitted by members of the program committee and underwent further rigorous rounds of review. This publication captures 12 of the conference's most promising papers, and we eagerly await the important contributions that we know these authors will bring to the field of computer and information science.
The biggest challenges faced by the software industry are cost control and schedule control. As such, effective strategies for process improvement must be researched and implemented. Analyzing the Role of Risk Mitigation and Monitoring in Software Development is a critical scholarly resource that explores software risk and development as organizations continue to implement more applications across multiple technologies and a multi-tiered environment. Featuring coverage on a broad range of topics such as quantitative risk assessment, threat analysis, and software vulnerability management, this book is a vital resource for engineers, academicians, professionals, and researchers seeking current research on the importance of risk management in software development.
In a fragment entitled Elementa Nova Matheseos Universalis (1683?) Leibniz writes "the mathesis [...] shall deliver the method through which things that are conceivable can be exactly determined"; in another fragment he takes the mathesis to be "the science of all things that are conceivable." Leibniz considers all mathematical disciplines as branches of the mathesis and conceives the mathesis as a general science of forms applicable not only to magnitudes but to every object that exists in our imagination, i.e. that is possible at least in principle. As a general science of forms the mathesis investigates possible relations between "arbitrary objects" ("objets quelconques"). It is an abstract theory of combinations and relations among objects whatsoever. In 1810 the mathematician and philosopher Bernard Bolzano published a booklet entitled Contributions to a Better-Grounded Presentation of Mathematics. There is, according to him, a certain objective connection among the truths that are germane to a certain homogeneous field of objects: some truths are the "reasons" ("Grunde") of others, and the latter are "consequences" ("Folgen") of the former. The reason-consequence relation seems to be the counterpart of causality at the level of a relation between true propositions. Arigorous proof is characterized in this context as a proof that shows the reason of the proposition that is to be proven. Requirements imposed on rigorous proofs seem to anticipate normalization results in current proof theory. The contributors of Mathesis Universalis, Computability and Proof, leading experts in the fields of computer science, mathematics, logic and philosophy, show the evolution of these and related ideas exploring topics in proof theory, computability theory, intuitionistic logic, constructivism and reverse mathematics, delving deeply into a contextual examination of the relationship between mathematical rigor and demands for simplification.
This book contains selected papers from the International Conference on Extreme Learning Machine 2015, which was held in Hangzhou, China, December 15-17, 2015. This conference brought together researchers and engineers to share and exchange R&D experience on both theoretical studies and practical applications of the Extreme Learning Machine (ELM) technique and brain learning. This book covers theories, algorithms and applications of ELM. It gives readers a glimpse of the most recent advances in ELM.
Since its first volume in 1960, Advances in Computers has
presented detailed coverage of innovations in computer hardware,
software, theory, design, and applications. It has also provided
contributors with a medium in which they can explore their subjects
in greater depth and breadth than journal articles usually allow.
As a result, many articles have become standard references that
continue to be of significant, lasting value in this rapidly
expanding field.
This book presents cutting-edge developments in the advanced mathematical theories utilized in computer graphics research - fluid simulation, realistic image synthesis, and texture, visualization and digital fabrication. A spin-off book from the International Symposium on Mathematical Progress in Expressive Image Synthesis in 2016 and 2017 (MEIS2016/2017) held in Fukuoka, Japan, it includes lecture notes and an expert introduction to the latest research presented at the symposium. The book offers an overview of the emerging interdisciplinary themes between computer graphics and the mathematical theories that drive it, such as discrete differential geometry. Further, it highlights open problems in those themes, making it a valuable resource not only for researchers, but also for graduate students interested in computer graphics and mathematics.
Social networks have emerged as a major trend in computing and social paradigms in the past few years. The social network model helps to inform the study of community behavior, allowing qualitative and quantitative assessments of how people communicate and the rules that govern communication. Social Networking and Community Behavior Modeling: Qualitative and Quantitative Measures provides a clear and consolidated view of current social network models. This work explores new methods for modeling, characterizing, and constructing social networks. Chapters contained in this book study critical security issues confronting social networking, the emergence of new mobile social networking devices and applications, network robustness, and how social networks impact the business aspects of organizations.
The Semantic Web is characterized by the existence of a very large number of distributed semantic resources, which together define a network of ontologies. These ontologies in turn are interlinked through a variety of different meta-relationships such as versioning, inclusion, and many more. This scenario is radically different from the relatively narrow contexts in which ontologies have been traditionally developed and applied, and thus calls for new methods and tools to effectively support the development of novel network-oriented semantic applications. This book by Suarez-Figueroa et al. provides the necessary methodological and technological support for the development and use of ontology networks, which ontology developers need in this distributed environment. After an introduction, in its second part the authors describe the NeOn Methodology framework. The book's third part details the key activities relevant to the ontology engineering life cycle. For each activity, a general introduction, methodological guidelines, and practical examples are provided. The fourth part then presents a detailed overview of the NeOn Toolkit and its plug-ins. Lastly, case studies from the pharmaceutical and the fishery domain round out the work. The book primarily addresses two main audiences: students (and their lecturers) who need a textbook for advanced undergraduate or graduate courses on ontology engineering, and practitioners who need to develop ontologies in particular or Semantic Web-based applications in general. Its educational value is maximized by its structured approach to explaining guidelines and combining them with case studies and numerous examples. The description of the open source NeOn Toolkit provides an additional asset, as it allows readers to easily evaluate and apply the ideas presented.
You may like...
Statistics for Managers Using Microsoft… by David Levine, David Stephan, … (Paperback) R2,568 / Discovery Miles 25 680
Excel 2019 for Advertising Statistics… by Thomas J. Quirk, Eric Rhiney (Paperback) R1,408 / Discovery Miles 14 080
Hacking Web Intelligence - Open Source… by Sudhanshu Chauhan, Nutan Kumar Panda (Paperback) R1,208 / Discovery Miles 12 080
VBA and Macros for Microsoft Office… by Bill Jelen, Tracy Syrstad (Paperback) R1,230 / Discovery Miles 12 300
Illustrated Microsoft (R)Office 365… by Elizabeth Reding, Lynn Wermers (Paperback)