Welcome to Loot.co.za!
Because efficient compilation of information allows managers and business leaders to make the best decisions for the financial solvency of their organizations, data analysis is an important part of modern business administration. Understanding the use of analytics, reporting, and data mining in everyday business environments is imperative to the success of modern businesses. Utilizing Big Data Paradigms for Business Intelligence is a pivotal reference source that provides vital research on how to address the challenges of data extraction in business intelligence using the five "Vs" of big data: velocity, volume, value, variety, and veracity. This book is ideally designed for business analysts, investors, corporate managers, entrepreneurs, and researchers in the fields of computer science, data science, and business intelligence.
Technology has opened a wide window of novel communication methods and techniques and has become ubiquitous in modern society. With advancements occurring rapidly and transforming practices and efficiencies within all fields including business, education, medicine, engineering, and so on, it is important to remain up to date on the latest research findings. Human-Computer Interaction and Technology Integration in Modern Society is a critical reference source that examines the integration of technological innovations into every aspect of modern society including education and business. Highlighting important topics that include digitization, human development, knowledge management, and open innovation, this book is ideal for IT specialists, policymakers, professionals, academicians, researchers, practitioners, and students.
A crash course in 8086/8088 assembler programming, taught in an easy way with practice at each step. You will learn how to use the registers, move data, do arithmetic, and handle text and graphics. You can run these programs on any PC, and no program exceeds 512 bytes of executable code! The example programs include:
* Guess the number.
* Tic-Tac-Toe game.
* Text graphics.
* Mandelbrot set.
* F-Bird game.
* Invaders game.
* Pillman game.
* Toledo Atomchess.
* bootBASIC language.
In recent years, swarm intelligence has become a popular computational approach among researchers working on optimization problems throughout the globe. Several swarm intelligence algorithms have been widely implemented because of their applicability to real-world problems, among other advantages. One such method, the Fireworks Algorithm, is an emerging technique that mimics the explosion process of fireworks to search local areas of the solution space. Many applications of this developing method remain unexplored, and research is necessary for scientists to fully understand the workings of this innovative system. The Handbook of Research on Fireworks Algorithms and Swarm Intelligence is a pivotal reference source that provides vital research on theory analysis, improvements, and applications of the fireworks algorithm. While highlighting topics such as convergence rate, parameter applications, and global optimization analysis, this publication explores up-to-date progress on the specific techniques of this algorithm. This book is ideally designed for researchers, data scientists, mathematicians, engineers, software developers, postgraduates, and academicians seeking coverage of this evolutionary computation method.
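As a rough illustration of the idea behind the Fireworks Algorithm (this sketch is not taken from the handbook, and the function and parameter names are invented for the example): each "firework" explodes into random "sparks" within a local amplitude, and the best points survive into the next generation.

```python
import random

def sphere(x):
    """Objective to minimize: f(x) = sum of squares."""
    return sum(v * v for v in x)

def fireworks_minimize(f, dim=2, n_fireworks=5, n_sparks=10,
                       amplitude=1.0, iterations=100, seed=0):
    """Minimal fireworks-style optimizer (illustrative sketch only)."""
    rng = random.Random(seed)
    # Initialize fireworks randomly in [-5, 5]^dim.
    pop = [[rng.uniform(-5, 5) for _ in range(dim)]
           for _ in range(n_fireworks)]
    for _ in range(iterations):
        sparks = []
        for fw in pop:
            for _ in range(n_sparks):
                # Explosion: perturb each coordinate within the amplitude.
                sparks.append([v + rng.uniform(-amplitude, amplitude)
                               for v in fw])
        # Selection: keep the best n_fireworks points overall.
        pop = sorted(pop + sparks, key=f)[:n_fireworks]
        amplitude *= 0.95  # shrink the local search region over time
    return min(pop, key=f)

best = fireworks_minimize(sphere)
print(best, sphere(best))
```

Real variants of the algorithm adapt the explosion amplitude and spark count per firework based on fitness; this sketch fixes both for brevity.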
The recent emergence and prevalence of social network applications, sensor-equipped mobile devices, and the availability of large amounts of geo-referenced data have enabled the analysis of new context dimensions that involve individual, social, and urban context. Creating Personal, Social, and Urban Awareness through Pervasive Computing provides an overview of the theories, techniques, and practical applications related to the three dimensions of context awareness. Through the exploration of emerging research trends in pervasive computing, this book is beneficial to professors, students, researchers, and developers interested in these latest developments in the field of context awareness and pervasive computing.
DOS programming at its best! Discover a book that tells you what you should do and how! Instead of jumping right into the instructions, this book first provides all the necessary concepts you need in order to make the learning process a whole lot easier. This way, you're sure not to get lost in confusion once you reach the more complex lessons in the later chapters. Graphs and flowcharts, as well as sample code, are provided for a more visual approach to your learning. You will also learn the designs and forms of DOS, and what's more convenient than getting to know both sides! Want to know more? Buy now!
The theory of computation is used to address challenges arising in many computer science areas, such as artificial intelligence, language processors, compiler writing, information and coding systems, programming language design, computer architecture, and more. To grasp topics concerning this theory, readers need to familiarize themselves with its computational and language models, which are based on concepts of discrete mathematics including sets, relations, functions, graphs, and logic. This handbook rigorously introduces these important concepts and uses them to cover the most important mathematical models for languages and computation, such as various classical as well as modern automata and grammars. It explains their use in such crucially significant topics of computation theory as computability, decidability, and computational complexity. The authors pay special attention to the implementation of all these mathematical concepts and models and explain clearly how to encode them in computational practice. All computer programs are written in C#.
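To give a flavor of the automata models such a handbook covers (the book's own programs are in C#; this Python sketch is an independent illustration, not taken from it), a deterministic finite automaton can be encoded as a transition table and simulated in a few lines:

```python
def run_dfa(transitions, start, accepting, word):
    """Simulate a deterministic finite automaton (DFA).

    transitions: dict mapping (state, symbol) -> next state
    Returns True if the DFA ends in an accepting state.
    """
    state = start
    for symbol in word:
        key = (state, symbol)
        if key not in transitions:
            return False  # undefined transition: reject
        state = transitions[key]
    return state in accepting

# Example: a DFA accepting binary strings with an even number of 1s.
even_ones = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
print(run_dfa(even_ones, "even", {"even"}, "1001"))  # True: two 1s
print(run_dfa(even_ones, "even", {"even"}, "1011"))  # False: three 1s
```

The same table-driven encoding extends naturally to the other models mentioned in the blurb (pushdown automata add a stack; Turing machines add a movable tape head).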
"Analysis of Turbulent Flows" is written by one of the most prolific authors in the field of CFD. Professor of Aerodynamics at SUPAERO and Director of DMAE at ONERA, Professor Tuncer Cebeci calls on both his academic and industrial experience when presenting this work. Each chapter has been specifically constructed to provide a comprehensive overview of turbulent flow and its measurement. "Analysis of Turbulent Flows" serves as an advanced textbook for PhD candidates working in the field of CFD and is essential reading for researchers, practitioners in industry, and MSc and MEng students. The field of CFD is strongly represented by the following corporate organizations: Boeing, Airbus, Thales, United Technologies, and General Electric. Government bodies and academic institutions also have a strong interest in this exciting field.
In businesses and organizations, understanding the social reality of individuals, groups, and cultures allows for in-depth understanding and rich analysis of multiple research areas to improve practices. Qualitative research provides important insight into the interactions of the workplace. Qualitative Techniques for Workplace Data Analysis is an essential reference source that discusses the qualitative methods used to analyze workplace data, as well as what measures should be adopted to ensure the credibility and dependability of qualitative findings in the workplace. Featuring research on topics such as collection methods, content analysis, and sampling, this book is ideally designed for academicians, development practitioners, business managers, and analytic professionals seeking coverage on quality measurement techniques in the occupational settings of emerging markets.
This book identifies and discusses critical issues of ICT innovation at both the macroeconomic and organisational levels, bringing together two hitherto independent fields of study: economics and information systems. The book takes stock of these two fields, highlighting their complementarity in contemporary issues such as business competitiveness and e-commerce, organisational change and industrial restructuring, information systems implementation and technology infrastructure building. The contributions cover a broad range of issues, from analysing policy approaches for fostering ICT innovation at a regional level, to examining the way in which ICT-based information systems and organisational practice are simultaneously shaped. The book elaborates an understanding of innovation as shaped largely in context, rather than 'diffused' from the place of its conception into the place of its implementation. The theoretical perspectives offered by the authors include institutional economics, evolutionary economics, social constructivism, and structuration theory. Collectively, the chapters of this book present ICT innovation as a dynamic process involving multiple actors in multiple locations, codified and tacit knowledge, and instrumental and situated behaviour. This pathbreaking book will be of enormous interest to students, researchers and academics specialising in economics, information systems and ICT innovation, as well as policy and management consultants involved in information systems and development.
Code Nation explores the rise of software development as a social, cultural, and technical phenomenon in American history. The movement germinated in government and university labs during the 1950s, gained momentum through corporate and counterculture experiments in the 1960s and 1970s, and became a broad-based computer literacy movement in the 1980s. As personal computing came to the fore, learning to program was transformed by a groundswell of popular enthusiasm, exciting new platforms, and an array of commercial practices that have been further amplified by distributed computing and the Internet. The resulting society can be depicted as a "Code Nation"-a globally-connected world that is saturated with computer technology and enchanted by software and its creation. Code Nation is a new history of personal computing that emphasizes the technical and business challenges that software developers faced when building applications for CP/M, MS-DOS, UNIX, Microsoft Windows, the Apple Macintosh, and other emerging platforms. It is a popular history of computing that explores the experiences of novice computer users, tinkerers, hackers, and power users, as well as the ideals and aspirations of leading computer scientists, engineers, educators, and entrepreneurs. Computer book and magazine publishers also played important, if overlooked, roles in the diffusion of new technical skills, and this book highlights their creative work and influence. Code Nation offers a "behind-the-scenes" look at application and operating-system programming practices, the diversity of historic computer languages, the rise of user communities, early attempts to market PC software, and the origins of "enterprise" computing systems. Code samples and over 80 historic photographs support the text. The book concludes with an assessment of contemporary efforts to teach computational thinking to young people.
"Windows 2012 Server Network Security" provides the most in-depth guide to deploying and maintaining a secure Windows network. The book drills down into all the new features of Windows 2012 and provides practical, hands-on methods for securing your Windows systems and networks, including:
* Secure remote access
* Network vulnerabilities and mitigations
* DHCP installation and configuration
* MAC filtering
* DNS server security
* WINS installation and configuration
* Securing wired and wireless connections
* Windows personal firewall
* Remote Desktop Services
* Internet connection sharing
* Network diagnostics and troubleshooting
Windows network security is of primary importance due to the sheer volume of data residing on Windows networks. "Windows 2012 Server Network Security" provides network administrators with the most focused and in-depth coverage of Windows network security threats, along with methods and techniques for securing important mission-critical networks and assets. The book also covers Windows 8.
Basic computer technology for seniors and beginners, presented in a clear, comprehensive format that is easy to understand.
Information in today's advancing world is rapidly expanding and becoming widely available. This eruption of data has made handling it a daunting and time-consuming task. Natural language processing (NLP) is a method that applies linguistics and algorithms to large amounts of this data to make it more valuable. NLP improves the interaction between humans and computers, yet there remains a lack of research that focuses on the practical implementations of this trending approach. Neural Networks for Natural Language Processing is a collection of innovative research on the methods and applications of linguistic information processing and its computational properties. This publication will support readers in performing sentence classification and language generation using neural networks, applying deep learning models to solve machine translation and conversation problems, and applying deep structured semantic models to information retrieval and natural language applications. While highlighting topics including deep learning, query entity recognition, and information retrieval, this book is ideally designed for research and development professionals, IT specialists, industrialists, technology developers, data analysts, data scientists, academics, researchers, and students seeking current research on the fundamental concepts and techniques of natural language processing.
You may like...
Advanced Multiresponse Process…
Tatjana V. Sibalija, Vidosav D. Majstorovic
Hardcover
Fuzzy Technology - Present Applications…
Mikael Collan, Mario Fedrizzi, …
Hardcover
Data Envelopment Analysis with R
Farhad Hosseinzadeh Lotfi, Ali Ebrahimnejad, …
Hardcover
Extending the Horizons: Advances in…
Edward K. Baker, Anito Joseph, …
Hardcover