This book presents the implementation of novel concepts and solutions that enhance the cyber security of administrative and industrial systems and the resilience of economies and societies to cyber and hybrid threats. This goal can be achieved through rigorous information sharing, enhanced situational awareness, advanced protection of industrial processes and critical infrastructures, and proper account of the human factor, as well as through adequate methods and tools for the analysis of big data, including data from social networks, to find the best ways to counter hybrid influence. The implementation of these methods and tools is examined here as part of the process of digital transformation through the incorporation of advanced information technologies, knowledge management, training and testing environments, and organizational networking. The book is of benefit to practitioners and researchers in the field of cyber security and protection against hybrid threats, as well as to policymakers and senior managers with responsibilities in information and knowledge management, security policies, and human resource management and training.
This book provides an overview of the resources and research projects that are bringing Big Data and High Performance Computing (HPC) on converging tracks. It demystifies Big Data and HPC for the reader by covering the primary resources, middleware, applications, and tools that enable the usage of HPC platforms for Big Data management and processing. Through interesting use cases from traditional and non-traditional HPC domains, the book highlights the most critical challenges related to Big Data processing and management, and shows ways to mitigate them using HPC resources. Unlike most books on Big Data, it covers a variety of alternatives to Hadoop, and explains the differences between HPC platforms and Hadoop. Written by professionals and researchers in a range of departments and fields, this book is designed for anyone studying Big Data and its future directions. Those studying HPC will also find the content valuable.
Just Enough R! An Interactive Approach to Machine Learning and Analytics presents just enough of the R language, machine learning algorithms, statistical methodology, and analytics for the reader to learn how to find interesting structure in data. The approach might be called "seeing then doing," as it first gives step-by-step explanations, using simple, understandable examples, of how the various machine learning algorithms work independent of any programming language. This is followed by detailed scripts written in R that apply the algorithms to solve nontrivial problems with real data. The script code is provided, allowing the reader to execute the scripts as they study the explanations given in the text.

Features:
- Gets you quickly using R as a problem-solving tool
- Uses RStudio's integrated development environment
- Shows how to interface R with SQLite
- Includes examples using R's Rattle graphical user interface
- Requires no prior knowledge of R, machine learning, or computer programming
- Offers over 50 scripts written in R, including several problem-solving templates that, with slight modification, can be used again and again
- Covers the most popular machine learning techniques, including ensemble-based methods and logistic regression
- Includes end-of-chapter exercises, many of which can be solved by modifying existing scripts
- Includes datasets from several areas, including business, health and medicine, and science

About the Author: Richard J. Roiger is a professor emeritus at Minnesota State University, Mankato, where he taught and performed research in the Computer and Information Science Department for over 30 years.
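The "seeing then doing" approach described above can be illustrated with one of the techniques the book covers, logistic regression. The book's own scripts are written in R; the following standalone Python sketch, with invented data points, only shows the general shape of such a script (fit by plain batch gradient descent, then predict):

```python
import math

# Hedged illustration only: the book's scripts are written in R, and these
# data points are invented. This sketch fits a one-feature logistic
# regression by plain batch gradient descent.
data = [(1.0, 0), (2.0, 0), (3.0, 1), (4.0, 1)]  # (feature, label)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b = 0.0, 0.0   # weight and bias
lr = 0.5          # learning rate
for _ in range(2000):
    gw = gb = 0.0
    for x, y in data:
        err = sigmoid(w * x + b) - y   # prediction error for this point
        gw += err * x
        gb += err
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

def predict(x):
    return sigmoid(w * x + b)

# The two classes separate around x = 2.5, so the fitted model's
# predictions should straddle 0.5 on either side of that boundary
print(predict(1.0), predict(4.0))
```

After training, the model assigns a probability well below 0.5 to points on the class-0 side of the boundary and well above 0.5 on the class-1 side.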
Cultural forces govern a synergistic relationship among information institutions that shapes their roles collectively and individually. Cultural synergy is the combination of perception- and behavior-shaping knowledge within, between, and among groups. Our hyperlinked era of the Semantic Web makes information sharing among institutions critically important for scholarship as well as for the advancement of humankind. Information institutions are those that have, or share in, the mission to preserve, conserve, and disseminate information objects and their informative content. A central idea is the notion of social epistemology: information institutions arise culturally from the social forces of the cultures they inhabit, and their purpose is to disseminate that culture. All information institutions are alike in critical ways. Intersecting lines of cultural mission are trajectories for synergy, allowing us to perceive the universe of information institutions as interconnected, evolving, and moving forward in distinct ways for the improvement of the human condition through the building up of its knowledge base and its information-sharing processes. This book is an exploration of the cultural synergy that can be realized by seeing commonalities among information institutions (sometimes also called cultural heritage institutions): museums, libraries, and archives. The book addresses the origins of cultural information institutions, the history of the professions that run them, and the social imperative of information organization as a catalyst for semantic synergy.
The central purpose of this collection of essays is to make a creative addition to the debates surrounding the cultural heritage domain. In the 21st century the world faces epochal changes which affect every part of society, including the arenas in which cultural heritage is made, held, collected, curated, exhibited, or simply exists. The book is about these changes; about the decentring of culture and cultural heritage away from institutional structures towards the individual; about the questions which the advent of digital technologies is demanding that we ask and answer in relation to how we understand, collect and make available Europe's cultural heritage. Cultural heritage has enormous potential in terms of its contribution to improving the quality of life for people, understanding the past, assisting territorial cohesion, driving economic growth, opening up employment opportunities and supporting wider developments such as improvements in education and in artistic careers. Given that spectrum of possible benefits to society, the range of studies that follow here are intended to be a resource and stimulus to help inform not just professionals in the sector but all those with an interest in cultural heritage.
This book gathers selected papers from the KES-IDT-2020 Conference, held as a virtual conference on June 17-19, 2020. The aim of the annual conference was to present and discuss the latest research results and to generate new ideas in the field of intelligent decision-making. However, the range of topics discussed during the conference was definitely broader, covering methods in classification, prediction, data analysis, big data, data science, decision support, knowledge engineering, and modeling in such diverse areas as finance, cybersecurity, economics, health, management, and transportation. Problems in Industry 4.0 and the IoT are also addressed. The book contains several sections devoted to specific topics, such as:
- Intelligent Data Processing and its Applications
- High-Dimensional Data Analysis and its Applications
- Multi-Criteria Decision Analysis - Theory and Applications
- Large-Scale Systems for Intelligent Decision-Making and Knowledge Engineering
- Decision Technologies and Related Topics in Big Data Analysis of Social and Financial Issues
- Decision-Making Theory for Economics
Large-Scale 3D Data Integration: Challenges and Opportunities examines the fundamental aspects of 3D geo-information, focusing on the latest developments in 3D GIS (geographic information) and AEC (architecture, engineering, construction) systems. This book addresses policy makers, designers and engineers, and individuals that need to overcome obstacles in integrating modeling perspectives and data. Organized into four major parts, the book begins by presenting a historical overview of the issues involved in integrating GIS and AEC. Part II then focuses on the data issue from several viewpoints: data collection; database structures and representation; database management; and visualization. Part III covers the areas of semantics, ontology, and standardization from a theoretical perspective and details many of the best examples of this approach in developing real-world applications. The book concludes with contributions that focus on recent advances in virtual geographic environments and alternative modeling schemes for the potential AEC/GIS interface.
The book proposes techniques, with an emphasis on the financial sector, that will make recommendation systems both accurate and explainable. The vast majority of AI models work like black box models. However, in many applications, e.g., medical diagnosis or venture capital investment recommendations, it is essential to explain the rationale behind AI systems' decisions or recommendations. Therefore, the development of artificial intelligence cannot ignore the need for interpretable, transparent, and explainable models. First, the main idea of the explainable recommenders is outlined within the background of neuro-fuzzy systems. In turn, various novel recommenders are proposed, each characterized by achieving high accuracy with a reasonable number of interpretable fuzzy rules. The main part of the book is devoted to the very challenging problem of stock market recommendations. An original concept of the explainable recommender, based on patterns from previous transactions, is developed; it recommends stocks that fit the strategy of investors, and its recommendations are explainable for investment advisers.
This book introduces computational advertising and Internet monetization. It provides a macroscopic understanding of how consumer products in the Internet era push user experience and monetization to the limit. Part One of the book focuses on the basic problems and background knowledge of online advertising. Part Two targets the product, operations, and sales staff, as well as high-level decision makers of Internet products. It explains the market structure, trading models, and the main products in computational advertising. Part Three targets systems engineers, algorithm designers, and architects, and focuses on the key technical challenges of different advertising products.

Features:
- Introduces computational advertising and Internet monetization
- Covers data processing, utilization, and trading
- Uses business logic as the driving force to explain online advertising products and technology advancement
- Explores the products and technologies of computational advertising, to provide insights on the realization of personalization systems, constrained optimization, data monetization and trading, and other practical industry problems
- Includes case studies and code snippets
The Unified Modeling Language (UML) is rapidly gaining acceptance as the mechanism of choice for modeling complex software systems at various steps of their specification and design, using a number of orthogonal views that illustrate use cases, class diagrams, and even detailed state machine-based behaviors of objects. Coverage includes UML and the real-time/embedded domain, with chapters on the role of UML in software development and on UML and real-time systems.
This book offers practical as well as conceptual knowledge of the latest trends, tools, techniques, and methodologies of data analytics in smart cities. The smart city is an advanced technological area that is capable of understanding its environment by examining data to improve livability. Smart cities allow different kinds of wireless sensors to gather massive amounts of city data, at high speed and across a broad range of sources. The smart city has a focus on data analytics facilitated through IoT platforms. There is a need to customize IoT architectures and infrastructures to address the needs of specific smart-city application domains such as transportation, traffic, health, and environment. Smart cities will provide next-generation development technologies for urbanization that address the need for environmental sustainability, personalization, mobility, optimum energy utilization, better administrative services, and a higher quality of life. Each chapter presents the reader with an in-depth investigation of data analytics in smart cities. The book presents cutting-edge and future perspectives of smart cities, where industry experts, scientists, and scholars exchange ideas and experience about frontier technologies, breakthrough and innovative solutions, and applications.
The increasing penetration of IT in organizations calls for an integrative perspective on enterprises and their supporting information systems. MERODE offers an intuitive and practical approach to enterprise modelling and to using these models as the core for building enterprise information systems. From a business analyst perspective, benefits of the approach are its simplicity and the possibility to evaluate the consequences of modeling choices through fast prototyping, without requiring any technical experience. The focus on domain modelling ensures the development of a common language for talking about essential business concepts and of a shared understanding of business rules. On the construction side, experienced benefits of the approach are a clear separation between specification and implementation, more generic and future-proof systems, and an improved insight into the cost of changes. A first distinguishing feature is the method's grounding in process algebra, which provides clear criteria and practical support for model quality. Second, the use of the concept of business events provides a deep integration between structural and behavioral aspects. The clear and intuitive semantics easily extend to application integration (COTS software and Web Services). Students and practitioners are the book's main target audience, as both groups will benefit from its practical advice on how to create complete models which combine structural and behavioral views of a system-to-be and which can readily be transformed into code, and on how to evaluate the quality of those models. In addition, researchers in the area of conceptual or enterprise modelling will find a concise overview of the main findings related to the MERODE project.
The work is complemented by a wealth of extra material on the author's web page at KU Leuven, including a free CASE tool with code generator, a collection of cases with solutions, and a set of domain modelling patterns that have been developed on the basis of the method's use in industry and government.
In this complete revision and expansion of his first SQL Puzzles book, Joe Celko challenges you with his trickiest puzzles and then helps solve them with a variety of solutions and explanations. Joe demonstrates the thought processes that are involved in attacking a problem from an SQL perspective to help advanced database programmers solve the puzzles they frequently face. These techniques not only help with the puzzle at hand, but also help develop the mindset needed to solve the many difficult SQL puzzles you face every day. Of course, part of the fun is to see whether or not you can write better solutions than Joe's.
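The set-based thought process this blurb refers to can be shown with a small puzzle in the classic Celko style: find each department's highest-paid employee in a single query rather than a procedural loop. This is a sketch using Python's built-in sqlite3 module; the Personnel table and its rows are invented for illustration, not taken from the book:

```python
import sqlite3

# A classic puzzle of the kind the book poses. Table and data are
# invented for illustration; they do not come from the book.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Personnel (emp_name TEXT, dept TEXT, salary INTEGER);
INSERT INTO Personnel VALUES
  ('Able', 'Acct', 60000), ('Baker', 'Acct', 71000),
  ('Chuck', 'Ship', 50000), ('Doris', 'Ship', 55000);
""")

# Set-based thinking: compare each row against an aggregate of its own
# group with a correlated subquery, instead of looping over departments.
rows = conn.execute("""
SELECT dept, emp_name, salary
  FROM Personnel AS P1
 WHERE salary = (SELECT MAX(salary)
                   FROM Personnel AS P2
                  WHERE P2.dept = P1.dept)
 ORDER BY dept;
""").fetchall()

for dept, name, salary in rows:
    print(dept, name, salary)
```

The correlated subquery recomputes the department maximum for each candidate row, so the whole answer falls out of one declarative statement; that shift away from row-by-row procedural code is the mindset the book trains.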
This monograph presents a collection of major developments leading toward the implementation of white space technology - an emerging wireless standard for using wireless spectrum in locations where it is unused by licensed users. Some of the key research areas in the field are covered. These include emerging standards, technical insights from early pilots and simulations, software defined radio platforms, geo-location spectrum databases and current white space spectrum usage in India and South Africa.
This book constitutes the refereed proceedings of the 36th IFIP TC 11 International Conference on Information Security and Privacy Protection, SEC 2021, held in Oslo, Norway, in June 2021.* The 28 full papers presented were carefully reviewed and selected from 112 submissions. The papers present novel research on theoretical and practical aspects of security and privacy protection in ICT systems. They are organized in topical sections on digital signatures; vulnerability management; covert channels and cryptography; application and system security; privacy; network security; machine learning for security; and security management. *The conference was held virtually.
This book starts from the relationship between urban built environment and travel behavior and focuses on analyzing the origin of traffic phenomena behind the data through multi-source traffic big data, which makes the book unique and different from the previous data-driven traffic big data analysis literature. This book focuses on understanding, estimating, predicting, and optimizing mobility patterns. Readers can find multi-source traffic big data processing methods, related statistical analysis models, and practical case applications from this book. This book bridges the gap between traffic big data, statistical analysis models, and mobility pattern analysis with a systematic investigation of traffic big data's impact on mobility patterns and urban planning.
The internet has launched the world into an era in which enormous amounts of data are generated every day through technologies, with both positive and negative consequences. This is often referred to as "big data". This book explores big data in organisations operating in the criminology and criminal justice fields. Big data entails a major disruption in the ways we think about and do things, which certainly applies to most organisations, including those operating in the criminology and criminal justice fields. Big data is currently disrupting processes in most organisations - how different organisations collaborate with one another, how organisations develop products or services, how organisations can identify, recruit, and evaluate talent, how organisations can make better decisions based on empirical evidence rather than intuition, and how organisations can quickly implement any transformation plan, to name a few. All these processes are important to tap into, but two underlying processes are critical to establish a foundation that will permit organisations to flourish and thrive in the era of big data: creating a culture more receptive to big data, and implementing a systematic, data-analytics-driven process within the organisation. Written in a clear and direct style, this book will appeal to students and scholars in criminology, criminal justice, sociology, and cultural studies, but also to government agencies, corporate and non-corporate organisations, and virtually any other institution impacted by big data.
Digital forensics deals with the acquisition, preservation, examination, analysis and presentation of electronic evidence. Computer networks, cloud computing, smartphones, embedded devices and the Internet of Things have expanded the role of digital forensics beyond traditional computer crime investigations. Practically every crime now involves some aspect of digital evidence; digital forensics provides the techniques and tools to articulate this evidence in legal proceedings. Digital forensics also has myriad intelligence applications; furthermore, it has a vital role in cyber security -- investigations of security breaches yield valuable information that can be used to design more secure and resilient systems. Advances in Digital Forensics XVI describes original research results and innovative applications in the discipline of digital forensics. In addition, it highlights some of the major technical and legal issues related to digital evidence and electronic crime investigations. The areas of coverage include: themes and issues, forensic techniques, filesystem forensics, cloud forensics, social media forensics, multimedia forensics, and novel applications. This book is the sixteenth volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.9 on Digital Forensics, an international community of scientists, engineers and practitioners dedicated to advancing the state of the art of research and practice in digital forensics. The book contains a selection of sixteen edited papers from the Sixteenth Annual IFIP WG 11.9 International Conference on Digital Forensics, held in New Delhi, India, in the winter of 2020. Advances in Digital Forensics XVI is an important resource for researchers, faculty members and graduate students, as well as for practitioners and individuals engaged in research and development efforts for the law enforcement and intelligence communities.
Preface: In nature, real-time systems have been evolving for some hundred million years. Animal nervous systems have the task of issuing control commands to the active organs in response to messages from the environment; conditioned reflexes, for example, play an important role here. Perhaps the emergence of the human being can be placed roughly at the time when his gradually developing brain produced thoughts whose significance reached, in a forward-planning way, beyond the immediately present situation. This eventually led, among other things, to today's scientist, who builds his theories and systems on the basis of lengthy deliberation. The development of computers essentially took the opposite path. At first they served only to execute "rigid" programs, such as the first program-controlled computing device, the Z3, which the undersigned was able to demonstrate in 1941. This was followed, among other things, by a special device for wing measurement, which can be regarded as the first process computer. About forty dial gauges operating as analog-to-digital converters were read by the automatic computer and processed as variables within a program. But even this still took place in a rigid sequence. True process control - today also called real-time systems - requires, however, reacting to constantly changing situations.
This volume explores, from a legal perspective, how blockchain works. Perhaps more than ever before, this new technology requires us to take a multidisciplinary approach. The contributing authors, who include distinguished academics, public officials from important national authorities, and market operators, discuss and demonstrate how this technology can be a driver of innovation and yield positive effects in our societies, legal systems, and economic/financial system. In particular, they present critical analyses of the potential benefits and legal risks of distributed ledger technology, while also assessing the opportunities offered by blockchain, and possible modes of regulating it. Accordingly, the discussions chiefly focus on the law and governance of blockchain, and thus on the paradigm shift that this technology can bring about.
The textbook covers the main aspects of Edge Computing, from a thorough look at the technology to the standards and industry associations working in the field. The book is conceived as a textbook for graduate students but also functions as a working guide for developers, engineers, and researchers. The book aims not only at providing a comprehensive technology and standards reference for students, but also useful research insights and practical exercises for edge software developers and investigators in the area (and for students looking to apply their skills). A particular emphasis is given to Multi-access Edge Computing (MEC) as defined by the European Telecommunications Standards Institute (ETSI), in relation to other standards organizations like 3GPP, in alignment with the recent industry efforts to produce harmonized standards for edge computing leveraging both ETSI ISG MEC and 3GPP specifications. Practical examples of Edge Computing implementation from industry groups, associations, companies, and edge developers complete the book and make it useful for students entering the field. The book includes exercises, examples, and quizzes throughout.
This book, the first volume, highlights 8 of the roughly 36 megacities in the world, which by definition have at least 10 million inhabitants. The cities/chapters presented in this book draw on recent advances such as the wide use of ICT, IoT, e-Governance, e-Democracy, the smart economy, and the flattening and acceleration of the world that has taken place in recent times, as reported by three-time Pulitzer Prize winner Thomas Friedman. It therefore departs from other ideologies in which only certain megacities qualify for the title of smart global megacity, when in reality every megacity can, and it presents how smart global megacities can be created.
As Web-based systems and e-commerce carry businesses into the 21st century, databases are becoming workhorses that shoulder each and every online transaction. For organizations to have effective 24/7 Web operations, they need powerhouse databases that deliver at peak performance - all the time. High Performance Web Databases: Design, Development, and Deployment arms you with every essential technique, from design and modeling to advanced topics such as data conversion, performance tuning, Web access, interfacing with legacy systems, and security.
This book negotiates the hyper-dimensions of the Internet through stories from myriad Web sites. Its fluent presentation and simple, chronological organization of topics highlight numerous opportunities, providing a solid starting point not only for inexperienced entrepreneurs and managers but for anyone interested in applying information technology in business through real or virtual enterprise networks. "A Manager's Primer on e-Networking" is an easy-to-follow primer on modern enterprise networking that every manager needs to read.
This edited book first consolidates the results of the EU-funded EDISON project (Education for Data Intensive Science to Open New science frontiers), which developed training material and information to assist educators, trainers, employers, and research infrastructure managers in identifying, recruiting and inspiring the data science professionals of the future. It then deepens the presentation of the information and knowledge gained to allow for easier assimilation by the reader. The contributed chapters are presented in sequence, each chapter picking up from the end point of the previous one. After the initial book and project overview, the chapters present the relevant data science competencies and body of knowledge, the model curriculum required to teach the required foundations, profiles of professionals in this domain, and use cases and applications. The text is supported with appendices on related process models. The book can be used to develop new courses in data science, evaluate existing modules and courses, draft job descriptions, and plan and design efficient data-intensive research teams across scientific disciplines.