Beginning Oracle SQL is your introduction to the interactive query tools and specific dialect of SQL used with Oracle Database. These tools include SQL*Plus and SQL Developer. SQL*Plus is the one tool any Oracle developer or database administrator can always count on, and it is widely used in creating scripts to automate routine tasks. SQL Developer is a powerful, graphical environment for developing and debugging queries. Oracle's dialect of SQL is possibly the most valuable from a career standpoint. Oracle's database engine is widely used in corporate environments worldwide, and it is also found in many government applications. Oracle SQL implements many features not found in competing products, and no developer or DBA working with Oracle can afford to be without knowledge of these features and how they work, because of the performance and expressiveness they bring to the table. Written in an easygoing and example-based style, Beginning Oracle SQL is the book that will get you started down the path to successfully writing SQL statements and getting results from Oracle Database. It takes an example-based approach with clear and authoritative explanations, introduces both SQL and the query tools used to execute SQL statements, and shows how to create tables, populate them with data, and then query that data to generate business results.
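As a hedged illustration of that create/populate/query workflow (not an excerpt from the book, which works in SQL*Plus and SQL Developer rather than Python), the same cycle might look like this with the python-oracledb driver; the table name, credentials, and connection string below are placeholders:

```python
import oracledb  # assumes the python-oracledb driver and a reachable Oracle instance

# Placeholder credentials and DSN; adjust for your own environment.
conn = oracledb.connect(user="demo", password="demo", dsn="localhost/XEPDB1")
cur = conn.cursor()

# Create a table and populate it with a few rows.
cur.execute("CREATE TABLE employees (id NUMBER PRIMARY KEY, name VARCHAR2(50), salary NUMBER)")
cur.executemany(
    "INSERT INTO employees (id, name, salary) VALUES (:1, :2, :3)",
    [(1, "Ada", 9000), (2, "Grace", 9500)],
)
conn.commit()

# Query the data back to produce a simple business result.
cur.execute("SELECT name, salary FROM employees WHERE salary > :min_sal", min_sal=9200)
for name, salary in cur:
    print(name, salary)
```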
This revised and extensively expanded edition of "Computability and Complexity Theory" comprises essential materials that are core knowledge in the theory of computation. The book is self-contained, with a preliminary chapter describing key mathematical concepts and notations. Subsequent chapters move from the qualitative aspects of classical computability theory to the quantitative aspects of complexity theory. Dedicated chapters on undecidability, NP-completeness, and relative computability focus on the limitations of computability and the distinctions between feasible and intractable problems. Substantial new content in this edition includes: a chapter on nonuniformity studying Boolean circuits, advice classes, and the important result of Karp and Lipton; a chapter studying properties of the fundamental probabilistic complexity classes; a study of the alternating Turing machine and uniform circuit classes; an introduction to counting classes, proving the famous results of Valiant and Vazirani and of Toda; and a thorough treatment of the proof that IP is identical to PSPACE. With its accessibility and well-devised organization, this text/reference is an excellent resource and guide for those looking to develop a solid grounding in the theory of computing. Beginning graduates, advanced undergraduates, and professionals involved in theoretical computer science, complexity theory, and computability will find the book an essential and practical learning tool. Topics and features: concise, focused materials cover the most fundamental concepts and results in the field of modern complexity theory, including the theory of NP-completeness, NP-hardness, the polynomial hierarchy, and complete problems for other complexity classes; contains information that otherwise exists only in the research literature and presents it in a unified, simplified manner; provides key mathematical background information, including sections on logic and on number theory and algebra; supported by numerous exercises and supplementary problems for reinforcement and self-study purposes.
This book constitutes the refereed proceedings of the 11th Latin American Symposium on Theoretical Informatics, LATIN 2014, held in Montevideo, Uruguay, in March/April 2014. The 65 papers presented together with 5 abstracts were carefully reviewed and selected from 192 submissions. The papers address a variety of topics in theoretical computer science with a certain focus on complexity, computational geometry, graph drawing, automata, computability, algorithms on graphs, algorithms, random structures, complexity on graphs, analytic combinatorics, analytic and enumerative combinatorics, approximation algorithms, analysis of algorithms, computational algebra, applications to bioinformatics, budget problems and algorithms and data structures.
This book constitutes the refereed proceedings of the 18th Annual International Conference on Research in Computational Molecular Biology, RECOMB 2014, held in Pittsburgh, PA, USA, in April 2014. The 35 extended abstracts were carefully reviewed and selected from 154 submissions. They report on original research in all areas of computational molecular biology and bioinformatics.
The importance of benchmarking in the service sector is well recognized, as it helps drive continuous improvement in products and work processes. Through benchmarking, companies have strived to implement best practices in order to remain competitive in the product market in which they operate. However, studies on benchmarking, particularly in the software development sector, have neglected using multiple variables and therefore have not been as comprehensive. Information Theory and Best Practices in the IT Industry fills this void by examining benchmarking in the business of software development and studying how it is affected by development process, application type, hardware platforms used, and many other variables. Information Theory and Best Practices in the IT Industry begins by examining practices of benchmarking productivity and critically appraises them. Next, the book identifies different variables which affect productivity and variables that affect quality, developing useful equations that explain their relationships. Finally, these equations and findings are applied to case studies. Utilizing this book, practitioners can decide what emphasis they should attach to different variables in their own companies, while seeking to optimize productivity and defect density.
Diversity is characteristic of the information age and also of statistics. To date, the social sciences have contributed greatly to the development of handling data under the rubric of measurement, while the statistical sciences have made phenomenal advances in theory and algorithms. Measurement and Multivariate Analysis promotes an effective interplay between those two realms of research: diversity with unity. The union and the intersection of those two areas of interest are reflected in the papers in this book, drawn from an international conference in Banff, Canada, with participants from 15 countries. In five major categories - scaling, structural analysis, statistical inference, algorithms, and data analysis - readers will find a rich variety of topics of current interest in the extended statistical community.
Semidefinite programs constitute one of the largest classes of optimization problems that can be solved with reasonable efficiency - both in theory and practice. They play a key role in a variety of research areas, such as combinatorial optimization, approximation algorithms, computational complexity, graph theory, geometry, real algebraic geometry and quantum computing. This book is an introduction to selected aspects of semidefinite programming and its use in approximation algorithms. It covers the basics but also a significant amount of recent and more advanced material. There are many computational problems, such as MAXCUT, for which one cannot reasonably expect to obtain an exact solution efficiently, and in such cases one has to settle for approximate solutions. For MAXCUT and its relatives, exciting recent results suggest that semidefinite programming is probably the ultimate tool. Indeed, assuming the Unique Games Conjecture, a plausible but as yet unproven hypothesis, it was shown that for these problems, known algorithms based on semidefinite programming deliver the best possible approximation ratios among all polynomial-time algorithms. This book follows the "semidefinite side" of these developments, presenting some of the main ideas behind approximation algorithms based on semidefinite programming. It develops the basic theory of semidefinite programming, presents one of the known efficient algorithms in detail, and describes the principles of some others. It also includes applications, focusing on approximation algorithms.
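For orientation, the standard Goemans-Williamson semidefinite relaxation of MAXCUT that such approximation algorithms build on (stated here as well-known background rather than as an excerpt from the book) reads, for a graph with edge weights $w_{ij}$:

$$
\max \; \tfrac{1}{2} \sum_{\{i,j\} \in E} w_{ij}\bigl(1 - \langle v_i, v_j \rangle\bigr)
\quad \text{subject to} \quad \|v_i\| = 1 \;\; \text{for all } i \in V .
$$

This is equivalent to a semidefinite program over a positive semidefinite matrix with unit diagonal; rounding the optimal vectors with a random hyperplane yields a cut of expected weight at least roughly 0.878 times the optimum.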
Fine pitch, high lead count integrated circuit packages represent a dramatic change from the conventional methods of assembling electronic components to a printed interconnect circuit board. To some, these FPT packages appear to be an extension of the assembly technology called surface mount, or SMT. Many of us who have spent a significant amount of time developing the process and design techniques for these fine pitch packages have concluded that these techniques go beyond those commonly used for SMT. In 1987 the present author, convinced of the uniqueness of the assembly and design demands of these packages, chaired a joint committee where the members agreed to use fine pitch technology (FPT) as the defining term for these demands. The committee was unique in several ways, one being that it was the first time three U.S. standards organizations, the IPC (Lincolnwood, IL), the EIA (Washington, D.C.), and the ASTM (Philadelphia), came together to create standards before a technology was in high demand. The term fine pitch technology and its acronym FPT have since become widely accepted in the electronics industry. Knowledge of the terms and demands of FPT currently exceeds the usage of FPT packaged components, but this is changing rapidly because of the size, performance, and cost savings of FPT. I have resisted several past invitations to write other technical texts. However, I feel there are important advantages and significant difficulties to be encountered with FPT.
Solders have given the designer of modern consumer, commercial, and military electronic systems a remarkable flexibility to interconnect electronic components. The properties of solder have facilitated broad assembly choices that have fueled creative applications to advance technology. Solder is the electrical and mechanical "glue" of electronic assemblies. This pervasive dependency on solder has stimulated new interest in applications as well as a more concerted effort to better understand materials properties. We need not look far to see solder being used to interconnect ever finer geometries. Assembly of micropassive discrete devices that are hardly visible to the unaided eye, of silicon chips directly to ceramic and plastic substrates, and of very fine peripheral leaded packages constitute a few of solder's uses. There has been a marked increase in university research related to solder. New electronic packaging centers stimulate applications, and materials engineering and science departments have demonstrated a new vigor to improve both the materials and our understanding of them. Industrial research and development continues to stimulate new applications, and refreshing new packaging ideas are emerging. New handbooks have been published to help both the neophyte and seasoned packaging engineer.
In many real-world problems, rare categories (minority classes) play essential roles despite their extreme scarcity. The discovery, characterization and prediction of rare categories of rare examples may protect us from fraudulent or malicious behavior, aid scientific discovery, and even save lives. This book focuses on rare category analysis, where the majority classes have smooth distributions, and the minority classes exhibit the compactness property. Furthermore, it focuses on the challenging cases where the support regions of the majority and minority classes overlap. The author has developed effective algorithms with theoretical guarantees and good empirical results for the related techniques, and these are explained in detail. The book is suitable for researchers in the area of artificial intelligence, in particular machine learning and data mining.
Near Field Communication is a radio frequency technology that allows objects, such as mobile phones, computers, tags, or posters, to exchange information wirelessly across a small distance. This report on the progress of Near Field Communication reviews the features and functionality of the technology and summarizes the broad spectrum of its current and anticipated applications. We explore the development of NFC technology in recent years, introduce the major stakeholders in the NFC ecosystem, and project its movement toward mainstream adoption. Several examples of early implementation of NFC in libraries are highlighted, primarily involving the use of NFC to enhance discovery by linking books or other physical objects with digital information about library resources, but also including applications of NFC to collection management and self-checkout. Future uses of NFC in libraries, such as smart posters or other enhanced outreach, are envisioned as well as the potential for the "touch paradigm" and "Internet of things" to transform the ways in which library users interact with the information environment. Conscious of the privacy and security of our patrons, we also address continuing concerns related to NFC technology and its expected applications, recommending caution, awareness, and education as immediate next steps for librarians.
Scientific workflow has seen massive growth in recent years as science becomes increasingly reliant on the analysis of massive data sets and the use of distributed resources. The workflow programming paradigm is seen as a means of managing the complexity in defining the analysis, executing the necessary computations on distributed resources, collecting information about the analysis results, and providing means to record and reproduce the scientific analysis. Workflows for e-Science presents an overview of the current state of the art in the field. It brings together research from many of the leading computer scientists in the workflow area and provides real-world examples from domain scientists actively involved in e-Science. The computer science topics addressed in the book provide a broad overview of active research, focusing on the areas of workflow representations and process models, component and service-based workflows, standardization efforts, workflow frameworks and tools, and problem solving environments and portals. The topics covered represent a broad range of scientific workflow research and will be of interest to a wide range of computer science researchers, domain scientists interested in applying workflow technologies in their work, and engineers wanting to develop workflow systems and tools. As such, Workflows for e-Science is an invaluable resource for potential or existing users of workflow technologies and a benchmark for developers and researchers. Ian Taylor is Lecturer in Computer Science at Cardiff University, and coordinator of Triana activities at Cardiff. He is the author of "From P2P to Web Services and Grids," also published by Springer. Ewa Deelman is a Research Assistant Professor at the USC Computer Science Department and a Research Team Leader at the Center for Grid Technologies at the USC Information Sciences Institute. Dennis Gannon is a professor of Computer Science in the School of Informatics at Indiana University. He is also Science Director for the Indiana Pervasive Technology Labs. Dr. Shields is a research associate at Cardiff and one of two lead developers for the Triana project.
The book offers an original view on channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, the features of which are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader gradually to gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of sought information.
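As background on the parity-check viewpoint that LDPC codes generalize (an illustrative sketch, not code from the book), here is a minimal Python example of encoding and single-error syndrome correction for the classical (7,4) Hamming block code:

```python
def hamming74_encode(data_bits):
    """Place 4 data bits at positions 3, 5, 6, 7 and parity bits at 1, 2, 4."""
    d1, d2, d3, d4 = data_bits
    p1 = d1 ^ d2 ^ d4   # checks positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # checks positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4   # checks positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def syndrome(received):
    # XOR of the 1-based positions holding a 1; zero means no single-bit error,
    # otherwise the value is the position of the flipped bit.
    s = 0
    for pos, bit in enumerate(received, start=1):
        if bit:
            s ^= pos
    return s

codeword = hamming74_encode([1, 0, 1, 1])
received = codeword[:]
received[4] ^= 1                # simulate a channel error at position 5
pos = syndrome(received)        # -> 5
if pos:
    received[pos - 1] ^= 1      # correct the flipped bit
assert received == codeword
```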
This book constitutes the proceedings of the 8th International Symposium on Foundations of Information and Knowledge Systems, FoIKS 2014, held in Bordeaux, France, in March 2014. The 14 revised full papers presented together with 5 revised short papers and two invited talks were carefully reviewed and selected from 52 submissions. The papers address various topics such as database design, dynamics of information, information fusion, integrity and constraint management, intelligent agents, knowledge discovery and information retrieval, knowledge representation, reasoning and planning, logics in databases and AI, mathematical foundations, security in information and knowledge systems, semi-structured data and XML, social computing, the Semantic Web and knowledge management, as well as the WWW.
This book constitutes the thoroughly refereed post-conference proceedings of the 14th International Conference on Membrane Computing, CMC 2013, held in Chișinău, Republic of Moldova, in August 2013. The 16 revised selected papers presented together with 6 invited lectures were carefully reviewed and selected from 26 papers presented at the conference. Membrane computing is an area of computer science aiming to abstract computing ideas and models from the structure and the functioning of living cells, as well as from the way the cells are organized in tissues or higher order structures. It deals with membrane systems, also called P systems, which are distributed and parallel algebraic models processing multisets of objects in a localized manner (evolution rules and evolving objects are encapsulated into compartments delimited by membranes), with an essential role played by the communication among compartments and with the environment.
Designing Sorting Networks: A New Paradigm provides an in-depth guide to maximizing the efficiency of sorting networks, and uses 0/1 cases, partially ordered sets and Hasse diagrams to closely analyze their behavior in an easy, intuitive manner.
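To make the 0/1-case idea concrete, here is a small hedged sketch (not drawn from the book): by the zero-one principle, a comparator network sorts every input if and only if it sorts every 0/1 input, so a 4-wire network can be verified by brute force over just 16 binary vectors. The comparator list below is the standard 5-comparator sorting network for 4 elements.

```python
from itertools import product

# Standard 5-comparator sorting network on 4 wires.
NETWORK_4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]

def apply_network(values, network):
    v = list(values)
    for i, j in network:            # each comparator swaps an out-of-order pair
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

def sorts_all_zero_one_inputs(network, n):
    # Zero-one principle: checking the 2**n binary inputs suffices.
    return all(apply_network(bits, network) == sorted(bits)
               for bits in product((0, 1), repeat=n))

print(sorts_all_zero_one_inputs(NETWORK_4, 4))   # True
```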
The Euclidean shortest path (ESP) problem asks the question: what is the path of minimum length connecting two points in a 2- or 3-dimensional space? Variants of this industrially-significant computational geometry problem also require the path to pass through specified areas and avoid defined obstacles. This unique text/reference reviews algorithms for the exact or approximate solution of shortest-path problems, with a specific focus on a class of algorithms called rubberband algorithms. Discussing each concept and algorithm in depth, the book includes mathematical proofs for many of the given statements. Suitable for a second- or third-year university algorithms course, the text enables readers to understand not only the algorithms and their pseudocodes, but also the correctness proofs, the analysis of time complexities, and other related topics. Topics and features: provides theoretical and programming exercises at the end of each chapter; presents a thorough introduction to shortest paths in Euclidean geometry, and the class of algorithms called rubberband algorithms; discusses algorithms for calculating exact or approximate ESPs in the plane; examines the shortest paths on 3D surfaces, in simple polyhedrons and in cube-curves; describes the application of rubberband algorithms for solving art gallery problems, including the safari, zookeeper, watchman, and touring polygons route problems; includes lists of symbols and abbreviations, in addition to other appendices. This hands-on guide will be of interest to undergraduate students in computer science, IT, mathematics, and engineering. Programmers, mathematicians, and engineers dealing with shortest-path problems in practical applications will also find the book a useful resource.
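As a hedged illustration of the local-optimization idea behind rubberband algorithms (a simplified sketch under stated assumptions, not the book's general algorithm), the following Python code approximates the shortest path from p to q that must cross a sequence of vertical segments at strictly increasing x-coordinates; each crossing point is repeatedly moved to the best position on its own segment given its two neighbours:

```python
def rubberband_esp(p, q, segments, max_iters=1000, eps=1e-9):
    """p, q: (x, y) endpoints; segments: list of (x, y_low, y_high) with
    strictly increasing x between p[0] and q[0]."""
    pts = [(x, (lo + hi) / 2.0) for x, lo, hi in segments]   # initial guesses
    for _ in range(max_iters):
        moved = 0.0
        for k, (x, lo, hi) in enumerate(segments):
            ax, ay = p if k == 0 else pts[k - 1]
            bx, by = q if k == len(segments) - 1 else pts[k + 1]
            # The point on the vertical line minimizing the two-leg length is
            # where the straight neighbour-to-neighbour segment crosses it;
            # clamping keeps it inside the allowed segment.
            y = ay + (by - ay) * (x - ax) / (bx - ax)
            y = min(max(y, lo), hi)
            moved = max(moved, abs(y - pts[k][1]))
            pts[k] = (x, y)
        if moved < eps:
            break
    return [p] + pts + [q]

# Toy usage: path from (0, 0) to (4, 0) forced through two narrow gates.
print(rubberband_esp((0, 0), (4, 0), [(1, 1.0, 2.0), (3, -2.0, -1.0)]))
```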
This book constitutes the refereed proceedings of the 17th International Conference on Practice and Theory in Public-Key Cryptography, PKC 2014, held in Buenos Aires, Argentina, in March 2014. The 38 papers presented were carefully reviewed and selected from 145 submissions. The papers are organized in topical sections on chosen ciphertext security, re-encryption, verifiable outsourcing, cryptanalysis, identity and attribute-based encryption, enhanced encryption, signature schemes, related-key security, functional authentication, quantum impossibility, privacy, and protocols.
This book constitutes the proceedings of the 20th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2014, which took place in Grenoble, France, in April 2014, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2014. The total of 42 papers included in this volume, consisting of 26 research papers, 3 case study papers, 6 regular tool papers and 7 tool demonstration papers, were carefully reviewed and selected from 161 submissions. In addition, the book contains one invited contribution. The papers are organized in topical sections named: decision procedures and their application in analysis; complexity and termination analysis; modeling and model checking discrete systems; timed and hybrid systems; monitoring, fault detection and identification; competition on software verification; specifying and checking linear time properties; synthesis and learning; quantum and probabilistic systems; as well as tool demonstrations and case studies.
A foreword is usually prepared by someone who knows the author or who knows enough to provide additional insight on the purpose of the work. When asked to write this foreword, I had no problem with what I wanted to say about the work or the author. I did, however, wonder why people read a foreword. It is probably of value to know the background of the writer of a book; it is probably also of value to know the background of the individual who is commenting on the work. I consider myself a good friend of the author, and when I was asked to write a few words I felt honored to provide my view of Ray Prasad, his expertise, and the contribution that he has made to our industry. This book is about the industry, its technology, and its struggle to learn and compete in a global market bursting with new ideas to satisfy a voracious appetite for new and innovative electronic products. I had the good fortune to be there at the beginning (or almost) and have witnessed the growth and excitement in the opportunities and challenges afforded the electronic industries' engineering and manufacturing talents. In a few years my involvement will span half a century.
This book constitutes the thoroughly refereed post-conference proceedings of the 5th International Workshop on Constructive Side-Channel Analysis and Secure Design, COSADE 2014, held in Paris, France, in April 2014. The 20 revised full papers presented together with two invited talks were carefully selected from 51 submissions. They collect practical results in cryptographic engineering, from concepts to artifacts, from software to hardware, and from attack to countermeasure.
This book constitutes the thoroughly refereed post-conference proceedings of the 10th European Workshop on Public Key Infrastructures, Services and Applications, EuroPKI 2013, held in Egham, UK, in September 2013. The 11 revised full papers presented together with 1 invited talk were carefully selected from 20 submissions. The papers are organized in topical sections such as authorization and delegation, certificate management, cross certification, interoperability, key management, legal issues, long-term archiving, time stamping, trust management, trusted computing, ubiquitous scenarios and Web services security.
The last few years have seen a great increase in the amount of data available to scientists, yet many of the techniques used to analyse this data cannot cope with such large datasets. Therefore, strategies need to be employed as a pre-processing step to reduce the number of objects or measurements whilst retaining important information. Spectral dimensionality reduction is one such tool for the data processing pipeline. Numerous algorithms and improvements have been proposed for the purpose of performing spectral dimensionality reduction, yet there is still no gold standard technique. This book provides a survey and reference aimed at advanced undergraduate and postgraduate students as well as researchers, scientists, and engineers in a wide range of disciplines. Dimensionality reduction has proven useful in a wide range of problem domains and so this book will be applicable to anyone with a solid grounding in statistics and computer science seeking to apply spectral dimensionality reduction to their work.
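As a hedged sketch of what an algorithm in this family looks like (illustrative only, assuming just NumPy; the book surveys many such methods), the following performs a basic Laplacian-eigenmaps-style embedding: build a symmetric k-nearest-neighbour graph, form the unnormalized graph Laplacian, and keep the eigenvectors with the smallest non-trivial eigenvalues as low-dimensional coordinates.

```python
import numpy as np

def spectral_embedding(X, n_neighbors=10, n_components=2):
    """Minimal Laplacian-eigenmaps-style reduction with an unweighted kNN graph."""
    n = X.shape[0]
    # Pairwise squared distances and a symmetric kNN adjacency matrix.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:n_neighbors + 1]   # skip the point itself
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)                            # symmetrize
    L = np.diag(W.sum(axis=1)) - W                    # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)
    # Skip the constant eigenvector (eigenvalue ~0); keep the next n_components.
    return vecs[:, 1:n_components + 1]

# Toy usage: embed 200 points sampled from a noisy 3-D helix into 2-D.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 3 * np.pi, 200))
X = np.column_stack([np.cos(t), np.sin(t), t]) + 0.05 * rng.normal(size=(200, 3))
print(spectral_embedding(X, n_neighbors=8).shape)     # (200, 2)
```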
This book is dedicated to Prof. Dr. Heinz Gerhäuser on the occasion of his retirement both from the position of Executive Director of the Fraunhofer Institute for Integrated Circuits IIS and from the Endowed Chair of Information Technologies with a Focus on Communication Electronics (LIKE) at the Friedrich-Alexander-Universität Erlangen-Nürnberg. Heinz Gerhäuser's vision and entrepreneurial spirit have made the Fraunhofer IIS one of the most successful and renowned German research institutions. He has been Director of the Fraunhofer IIS since 1993, and under his leadership it has grown to become the largest of Germany's 60 Fraunhofer Institutes, a position it retains to this day, currently employing over 730 staff. Likely his most important scientific as well as application-related contribution was his pivotal role in the development of the mp3 format, which would later become a worldwide success. The contributions to this Festschrift were written by both Fraunhofer IIS staff and external project team members in appreciation of Prof. Dr. Gerhäuser's lifetime academic achievements and his inspiring leadership at the Fraunhofer IIS. The papers reflect the broad spectrum of the institute's research activities and are grouped into sections on circuits, information systems, visual computing, and audio and multimedia. They provide academic and industrial researchers in fields like signal processing, sensor networks, microelectronics, and integrated circuits with an up-to-date overview of research results that have a huge potential for cutting-edge industrial applications.
Among the group of physics honors students huddled in 1957 on a Colorado mountain watching Sputnik bisect the heavens, one young scientist was destined, three short years later, to become a key player in America's own top-secret spy satellite program. One of our era's most prolific mathematicians, Karl Gustafson was given just two weeks to write the first US spy satellite's software. The project would fundamentally alter America's Cold War strategy, and this autobiographical account of a remarkable academic life spent in the top flight tells this fascinating inside story for the first time. Gustafson takes you from his early pioneering work in computing, through fascinating encounters with Nobel laureates and Fields medalists, to his current observations on mathematics, science and life. He tells of brushes with death, being struck by lightning, and the beautiful women who have been a part of his journey.
You may like...
Royal Horticultural Society Desk Address…
Royal Horticultural Society
Hardcover