A crash course in 8086/8088 assembler programming, taught the easy way with practice at each step. You will learn how to use the registers, move data, do arithmetic, and handle text and graphics. You can run these programs on any PC, and no program exceeds 512 bytes of executable code! The example programs include: * Guess the number. * Tic-Tac-Toe game. * Text graphics. * Mandelbrot set. * F-Bird game. * Invaders game. * Pillman game. * Toledo Atomchess. * bootBASIC language.
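To give a flavor of the genre, here is a minimal boot-sector sketch that prints a string and fits within the 512-byte limit. It assumes NASM syntax and BIOS teletype output, and is an illustration of this style of programming rather than an example taken from the book:

        ; Minimal boot-sector sketch (assumptions: NASM syntax, BIOS int 10h
        ; teletype output; illustrative only, not from the book itself).
                bits 16
                org 0x7C00              ; BIOS loads the boot sector here
        start:
                xor ax, ax
                mov ds, ax              ; DS = 0 so [msg] resolves correctly
                mov si, msg             ; SI points at the zero-terminated string
        .print:
                lodsb                   ; AL = [DS:SI], advance SI
                or al, al               ; reached the terminator?
                jz .halt
                mov ah, 0x0E            ; BIOS teletype: print AL
                xor bx, bx              ; display page 0
                int 0x10
                jmp .print
        .halt:
                hlt                     ; idle until reset
                jmp .halt
        msg:    db 'Fits in 512 bytes!', 0
                times 510-($-$$) db 0   ; pad the sector to 510 bytes
                dw 0xAA55               ; mandatory boot signature

Assembled as a flat binary, the result is exactly one 512-byte sector that any PC's BIOS will load and run.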
Information in today's advancing world is rapidly expanding and becoming widely available. This eruption of data has made handling it a daunting and time-consuming task. Natural language processing (NLP) is a method that applies linguistics and algorithms to large amounts of this data to make it more valuable. NLP improves the interaction between humans and computers, yet there remains a lack of research that focuses on the practical implementations of this trending approach. Neural Networks for Natural Language Processing is a collection of innovative research on the methods and applications of linguistic information processing and its computational properties. This publication will support readers in performing sentence classification and language generation using neural networks, applying deep learning models to solve machine translation and conversation problems, and applying deep structured semantic models to information retrieval and natural language applications. While highlighting topics including deep learning, query entity recognition, and information retrieval, this book is ideally designed for research and development professionals, IT specialists, industrialists, technology developers, data analysts, data scientists, academics, researchers, and students seeking current research on the fundamental concepts and techniques of natural language processing.
With the rapid development of big data, the traditional cloud computing model requires the massive data generated by end devices to be transferred to the cloud. However, the delays caused by massive data transmission no longer meet the requirements of real-time mobile services. Edge computing has therefore emerged as a new computing paradigm that collects and processes data at the edge of the network, which brings significant convenience in solving problems such as delay, bandwidth, and offloading in the traditional cloud computing paradigm. By extending the functions of the cloud to the edge of the network, edge computing provides effective data access control, computation, processing, and storage for end devices. Furthermore, edge computing optimizes the seamless connection from the cloud to devices, which is considered the foundation for realizing the interconnection of everything. However, due to the open features of edge computing, such as content awareness, real-time computing, and parallel processing, the existing privacy problems in the edge computing environment have become more prominent. The access of multiple categories and large numbers of devices in edge computing also creates new privacy issues. This book discusses the research background and current state of research on privacy protection in edge computing. The first chapter reviews the state of the art of edge computing research. The second chapter discusses data privacy issues and attack models in edge computing. Three categories of privacy-preserving schemes are then introduced in the following chapters. Chapter three introduces a context-aware privacy-preserving scheme. Chapter four introduces a location-aware differential privacy preserving scheme. Chapter five presents a new blockchain-based decentralized privacy-preserving scheme in edge computing. Chapter six summarizes the monograph and proposes future research directions. In summary, this book introduces the following techniques in edge computing: 1) an MDP-based privacy-preserving model that addresses context-aware data privacy in the hierarchical edge computing paradigm; 2) an SDN-based clustering method that addresses location-aware privacy problems in edge computing; 3) a novel blockchain-based decentralized privacy-preserving scheme in edge computing. These techniques enable the rapid development of privacy preservation in edge computing.
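For reference, location-aware schemes of the kind covered in chapter four build on the standard notion of differential privacy (this is the textbook definition, not a formula quoted from the book): a randomized mechanism $M$ is $\varepsilon$-differentially private if, for all neighboring datasets $D$ and $D'$ and every set of outputs $S$,

$$ \Pr[M(D) \in S] \le e^{\varepsilon} \, \Pr[M(D') \in S]. $$

A smaller $\varepsilon$ means an observer learns less about any individual device or location from the mechanism's output.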
Great new publication with first-time-ever-released professional programming! Are you aware that C is one of the most popular and most commonly used programming languages today? Did you know that many expert developers started by learning C in order to become knowledgeable in computer programming? Were you aware that your children are learning C programming today in their schools? If you have doubts about learning the language, don't! C is actually easy to learn. Compared to C++, C is much simpler! You do not need to spend years to become a master of this language. Well, start right here! Learn the necessary coding in less than a day, and become knowledgeable enough to move up the ladder to becoming a proficient programmer! Start right now, and by the time you finish and implement the steps here, you will have learned everything you need to know in less than a day!
Information communication technologies (ICT) have long been important in supporting doctoral study. Though ICTs have been integrated into educational practices at all levels, there is little understanding of how effective these technologies are in supporting resource development for students and researchers in academic institutions. Enhancing the Role of ICT in Doctoral Research Processes is a collection of innovative research that identifies the ways that doctoral supervisors and students perceive the role of ICTs within the doctoral research process and supports the development of guidelines to enhance ICT skills within these programs. While highlighting topics including professional development, online learning, and ICT management, this book is ideally designed for academicians, researchers, and professionals seeking current research on ICT use for doctoral research.
As real-time and integrated systems become increasingly sophisticated, issues related to development life cycles, non-recurring engineering costs, and poor synergy between development teams will arise. The Handbook of Research on Embedded Systems Design provides insights from the computer science community on integrated systems research projects taking place in the European region. This premier reference work examines the diverse range of design principles covered by these projects, from specification at high abstraction levels using standards such as UML and related profiles through intermediate design phases. This work will be invaluable to designers of embedded software, academicians, students, practitioners, professionals, and researchers working in the computer science industry.
Due to the growing use of web applications and communication devices, the use of data has increased throughout various industries. It is necessary to develop new techniques for managing data in order to ensure adequate usage. The Handbook of Research on Pattern Engineering System Development for Big Data Analytics is a critical scholarly resource that examines the incorporation of pattern management into business technologies, as well as decision-making and prediction processes, through the use of data management and analysis. Featuring coverage of a broad range of topics such as business intelligence, feature extraction, and data collection, this publication is geared towards professionals, academicians, practitioners, and researchers seeking current research on the development of pattern management systems for business applications.
The book collects three years of research in the field of penetration testing security. It does not describe underground or fancy techniques; it focuses on the state of the art in penetration testing methodologies. In other words: if you need to test a system, how do you proceed? What is the first step? What tools can be used? What path do you follow to find flaws? The book shows many real-world examples of how the described methodology has been used, for example: penetration testing on electronic voting machines, how malware used the described methodology to bypass common security mechanisms, and attacks on reputation systems.
C++ Programming Professional Made Easy: Expert C++ Programming Language Success in a Day for Any Computer User! Want to take your programming to the next level? Sam Key is right back, building on the great foundation of his C programming book. Did you love his first technical book? Well, now you can take it up a notch! Know the basics and want to get right into variables and operators? Discouraged by learning all the user inputs? Let's master flow controls! Grab your copy today and let's dive right in! PURCHASE YOUR COPY NOW!
Digital collaboration is abundant in today's world, but it is often problematic and does not provide an apt solution to the human need for comprehensive communication. Humans require more personal interactions beyond what can be achieved online. Returning to Interpersonal Dialogue and Understanding Human Communication in the Digital Age is a collection of innovative studies on the methods and applications of comparing online human interactions to face-to-face interactions. While highlighting topics including digital collaboration, social media, and privacy, this book is a vital reference source for public administrators, educators, businesses, academicians, and researchers seeking current research on the importance of non-digital communication between people.
Simulating for a crisis is far more than creating a simulation of a crisis situation. In order for a simulation to be useful during a crisis, it should be created within the space of a few days to allow decision makers to use it as quickly as possible. Furthermore, during a crisis the aim is not to optimize just one factor, but to balance various, interdependent aspects of life. In the COVID-19 crisis, decisions had to be made concerning e.g. whether to close schools and restaurants, and the (economic) consequences of a 3 or 4-week lock-down had to be considered. As such, rather than one simulation focusing on a very limited aspect, a framework allowing the simulation of several different scenarios focusing on different aspects of the crisis was required. Moreover, the results of the simulations needed to be easily understandable and explainable: if a simulation indicates that closing schools has no effect, this can only be used if the decision makers can explain why this is the case. This book describes how a simulation framework was created for the COVID-19 crisis, and demonstrates how it was used to simulate a wide range of scenarios that were relevant for decision makers at the time. It also discusses the usefulness of the approach, and explains the decisions that had to be made along the way as well as the trade-offs. Lastly, the book examines the lessons learned and the directions for the further development of social simulation frameworks to make them better suited to crisis situations, and to foster a more resilient society.
Technology's presence in society continues to increase as new products and programs emerge. As such, it is vital for various industries to rapidly adapt and learn to incorporate the latest technology applications and tools. The Handbook of Research on Technology Integration in the Global World is an essential reference source that examines a variety of approaches to integrating technology through technology diffusion, e-collaboration, and e-adoption. The book explores topics such as information systems agility, semantic web, and the digital divide. This publication is a valuable resource for academicians, practitioners, researchers, and upper-level graduate students.
"Extended Finite Element Method" provides an introduction to the extended finite element method (XFEM), a novel computational method which has been proposed to solve complex crack propagation problems. The book helps readers understand the method and make effective use of the XFEM code and software plugins now available to model and simulate these complex problems. The book explores the governing equation behind XFEM, including
level set method and enrichment shape function. The authors outline
a new XFEM algorithm based on the continuum-based shell and
consider numerous practical problems, including planar
discontinuities, arbitrary crack propagation in shells and dynamic
response in 3D composite materials.
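For orientation, the classical XFEM displacement approximation found throughout the literature (the standard form, not a formula quoted from this book) enriches the ordinary finite element basis with discontinuous and crack-tip terms:

$$ u^h(\mathbf{x}) = \sum_{i \in I} N_i(\mathbf{x})\,\mathbf{u}_i + \sum_{j \in J} N_j(\mathbf{x})\,H(\mathbf{x})\,\mathbf{a}_j + \sum_{k \in K} N_k(\mathbf{x}) \sum_{\alpha=1}^{4} F_\alpha(\mathbf{x})\,\mathbf{b}_{k\alpha}, $$

where the $N_i$ are standard shape functions, $H$ is a Heaviside function that jumps across the crack faces, the $F_\alpha$ are crack-tip branch functions, and the level set method is typically used to track the crack geometry that drives the enrichment.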
Personal technology continues to evolve every day, but business technology does not follow that trend. Business IT is often treated as a necessary evil that can't be relied upon to take companies to the next level in their corporate evolution. In "The Golden Age of Drive-Thru IT," author Kedar Sathe offers useful, wide-ranging, and imaginative advice about how to revive and strengthen IT departments. Sathe, who has been programming computers since age fourteen, discusses how businesses must establish and execute new IT strategies to maintain and increase their bottom line. "The Golden Age of Drive-Thru IT" describes various aspects of technology and how IT can rise to every occasion and become a strategic enabler. It shows how IT can become nimble and flexible, yet produce robust and graceful solutions that allow companies to drive toward success in an efficient and enriching fashion. "The Golden Age of Drive-Thru IT" communicates how innovative ideas and smart, enthusiastic contributors will allow IT to transform itself, reinvent itself, rise to its true potential, and stop selling itself short.
The recent emergence and prevalence of social network applications, sensor-equipped mobile devices, and the availability of large amounts of geo-referenced data have enabled the analysis of new context dimensions that involve individual, social, and urban context. Creating Personal, Social, and Urban Awareness through Pervasive Computing provides an overview of the theories, techniques, and practical applications related to the three dimensions of context awareness. Through its exploration of emerging research trends in pervasive computing, this book is beneficial to professors, students, researchers, and developers interested in the latest developments in the field of context awareness and pervasive computing.
This book presents applications of multi-criteria decision-making techniques for managerial discretion. It provides a concerted platform for peers and management organizations to understand and implement these tools and to deal with practical problems more effectively, enabling more robust managerial decision making.
Method engineering is a very young field. Generally, method engineering ranges from engineering an entire methodology for information systems development to engineering modeling techniques according to project requirements. Computer-aided method engineering concerns the generation and use of information systems design techniques according to user needs; such environments are sometimes called generic tools or MetaCASE tools. Computer-Aided Method Engineering: Designing CASE Repositories for the 21st Century presents a methodology and architecture for a CASE repository, forwarding a theory that brings component-based development into CASE tool design and development and covering repository construction principles for the 21st century.
How well does your organization manage the risks associated with information quality? Managing information risk is becoming a top priority on the organizational agenda. The increasing sophistication of IT capabilities, along with the constantly changing dynamics of global competition, is forcing businesses to make use of their information more effectively. Information is becoming a core resource and asset for all organizations; however, it also brings many potential risks to an organization, from strategic, operational, financial, and compliance risks to environmental and societal ones. If you continue to struggle to understand and measure how information and its quality affect your business, this book is for you. This reference is in direct response to the new challenges that all managers have to face. Our process helps your organization to understand the "pain points" regarding poor data and information quality so you can concentrate on problems that have a high impact on core business objectives. This book provides you with all the fundamental concepts, guidelines, and tools to ensure core business information is identified, protected, and used effectively, and it is written in a language that is clear and easy to understand for non-technical managers.
You may like...
Discovering Computers, Essentials… by Susan Sebok, Jennifer Campbell, … (Paperback)
Dynamic Web Application Development… by David Parsons, Simon Stobart (Paperback)
Infinite Words, Volume 141 - Automata… by Dominique Perrin, Jean-Eric Pin (Hardcover, R4,119)
Computer-Graphic Facial Reconstruction by John G. Clement, Murray K. Marks (Hardcover, R2,349)