This book presents selected peer-reviewed contributions from the International Work-Conference on Time Series, ITISE 2017, held in Granada, Spain, September 18-20, 2017. It discusses topics in time series analysis and forecasting, including advanced mathematical methodology, computational intelligence methods for time series, dimensionality reduction and similarity measures, econometric models, energy time series forecasting, forecasting in real problems, online learning in time series as well as high-dimensional and complex/big data time series. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing computer science, mathematics, statistics and econometrics.
Adaptive Technologies and Business Integration: Social, Managerial and Organizational Dimensions provides an authoritative review of both intra-organizational and inter-organizational aspects of business integration, including managerial and organizational integration, social integration, and technology integration, along with the resources needed to accomplish this competitive advantage. This Premier Reference Source contains the most comprehensive knowledge on business integration, providing an all-encompassing perspective on its importance in emerging networked, extended, and collaborative organizational models. The innovative research contained in this reference work makes it an essential addition to every library.
Soft computing, as an engineering science, and statistics, as a classical branch of mathematics, emphasize different aspects of data analysis.
The work presented in this book focuses on modeling audiovisual quality as perceived by the users of IP-based solutions for video communication, such as videotelephony. It also extends the current framework for the parametric prediction of audiovisual call quality. The book addresses several aspects related to the quality perception of entire video calls, namely the quality estimation of the individual audio and video modalities in an interactive context, the audiovisual quality integration of these modalities, and the temporal pooling of short sample-based quality scores to account for the perceptual quality impact of time-varying degradations.
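The temporal pooling step lends itself to a small illustration. The sketch below is not the book's model; it shows one common pooling heuristic (an average biased toward the worst-quality moments) applied to hypothetical per-second quality scores on a 1-5 scale, with the fraction and weight chosen purely for illustration.

```python
# Illustrative temporal pooling of per-second quality scores (1-5 scale).
# A generic heuristic, not the parametric model discussed in the book.

def pool_scores(scores, worst_fraction=0.2, weight=0.7):
    """Blend the overall mean with the mean of the worst moments,
    reflecting that viewers remember quality drops disproportionately."""
    ordered = sorted(scores)
    k = max(1, int(len(ordered) * worst_fraction))
    worst_mean = sum(ordered[:k]) / k
    overall_mean = sum(scores) / len(scores)
    return weight * worst_mean + (1 - weight) * overall_mean

# A call with a mid-call degradation: quality dips from ~4.3 to ~2.2 and recovers.
per_second = [4.3, 4.2, 4.3, 2.1, 2.4, 2.2, 4.1, 4.3, 4.2, 4.3]
print(round(pool_scores(per_second), 2))  # pooled score sits well below the plain mean
```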
"We live in the age of data. In the last few years, the methodology of extracting insights from data or "data science" has emerged as a discipline in its own right. The R programming language has become one-stop solution for all types of data analysis. The growing popularity of R is due its statistical roots and a vast open source package library. The goal of "Beginning Data Science with R" is to introduce the readers to some of the useful data science techniques and their implementation with the R programming language. The book attempts to strike a balance between the how: specific processes and methodologies, and understanding the why: going over the intuition behind how a particular technique works, so that the reader can apply it to the problem at hand. This book will be useful for readers who are not familiar with statistics and the R programming language.
The theme of CUTE is the various aspects of ubiquitous computing, and the conference provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area. Accordingly, this book includes various theories and practical applications in ubiquitous computing.
SQL Server 2008 is the latest update to Microsoft's flagship database management system. It is the largest update since SQL Server 2005, and it brings an increased ability to deliver data across more platforms, and thus to many different types of devices. New functionality also allows for easy storage and retrieval of digitized images and video. These attributes address the recent explosion in the popularity of web-based video and of server and desktop virtualization. The Real MCTS SQL Server 2008 Exam 70-433 Prep Kit prepares readers for the Microsoft Certified Technology Specialist exam: SQL Server 2008, Database Development. This is a new exam in the SQL Server product family; it comprises some objectives from exam 70-431 for SQL Server 2005 and covers the new, expanded query capabilities in SQL Server 2008. According to Microsoft, exam 70-431 for SQL Server 2005 was passed by over 35,000 people, and 150,000 people passed a similar exam for SQL Server 2000. Additionally, this exam is a prerequisite for those going on to obtain the MCITP: Database Developer 2008, a certification whose SQL Server 2005 predecessor counts some 2,500 successful candidates. The Prep Kit offers practice exams via the book's companion site.
This book highlights recent developments in the field, presented at the Social Simulation 2015 conference in Groningen, The Netherlands. It covers advances in both the applications and the methods of social simulation. The societal issues addressed range across complexities in economic systems, opinion dynamics and civil violence, changing mobility patterns, changes in land use, the transition of the energy system, food production and consumption, ecosystem management, and historical processes. Methodological developments cover how to use empirical data in validating models in general, the formalization of behavioral theory in agent behavior, the construction of artificial populations for experimentation, the replication of models, and agent-based models that can be run in a web browser. Social simulation is a rapidly evolving field: social scientists are increasingly interested in it as a tool for tackling the complex non-linear dynamics of society, and the software and hardware tools available for social simulation are becoming more and more powerful. This book is an important source for readers interested in the newest developments in the ways in which the simulation of social interaction contributes to our understanding and management of complex social phenomena.
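As a flavor of the agent-based modeling discussed in the volume, here is a minimal sketch of a classic bounded-confidence opinion-dynamics model in the style of Deffuant et al.; the population size, confidence bound, and convergence rate are arbitrary illustration values, not taken from any chapter.

```python
import random

# Minimal bounded-confidence opinion dynamics (Deffuant-style sketch).
# Agents hold opinions in [0, 1]; randomly paired agents move closer
# only if their opinions already differ by less than EPSILON.

N, EPSILON, MU, STEPS = 100, 0.2, 0.5, 20000
opinions = [random.random() for _ in range(N)]

for _ in range(STEPS):
    i, j = random.sample(range(N), 2)
    if abs(opinions[i] - opinions[j]) < EPSILON:
        shift = MU * (opinions[j] - opinions[i])
        opinions[i] += shift
        opinions[j] -= shift

# With a small EPSILON the population typically fragments into opinion clusters.
print(sorted(round(o, 2) for o in opinions)[::10])
```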
Let's try to play the music and not the background. Ornette Coleman, liner notes of the LP "Free Jazz" [20]. When I began to create a course on free jazz, the risk of such an enterprise was immediately apparent: I knew that Cecil Taylor had failed to teach such a matter, and that for other, more academic instructors, the topic was still a sort of outlandish adventure. To be clear, we are not talking about teaching improvisation here (a different, and also problematic, matter); rather, we wish to create a scholarly discourse about free jazz as a cultural achievement, and follow its genealogy from the American jazz tradition through its various outbranchings, such as the European and Japanese jazz conceptions and interpretations. We also wish to discuss some of the underlying mechanisms that are extant in free improvisation, things that could be called technical aspects. Such a discourse bears the flavor of a contradictio in adiecto: teaching the unteachable, the very negation of rules, above all those posited by white jazz theorists, and talking about the making of sounds without aiming at so-called factual results and all those intellectual sedimentations: is this not a suicidal topic? My own endeavors as a free jazz pianist have informed and advanced my conviction that this art has never been theorized in a satisfactory way, not even by Ekkehard Jost in his unequaled, phenomenologically precise pioneering book "Free Jazz" [57].
This volume presents the latest advances and trends in nonparametric statistics, and gathers selected and peer-reviewed contributions from the 3rd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Avignon, France on June 11-16, 2016. It covers a broad range of nonparametric statistical methods, from density estimation, survey sampling, resampling methods, kernel methods and extreme values, to statistical learning and classification, both in the standard i.i.d. case and for dependent data, including big data. The International Society for Nonparametric Statistics is uniquely global, and its international conferences are intended to foster the exchange of ideas and the latest advances among researchers from around the world, in cooperation with established statistical societies such as the Institute of Mathematical Statistics, the Bernoulli Society and the International Statistical Institute. The 3rd ISNPS conference in Avignon attracted more than 400 researchers from around the globe, and contributed to the further development and dissemination of nonparametric statistics knowledge.
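To make one of the listed topics concrete, here is a minimal sketch of kernel density estimation with a Gaussian kernel; the bandwidth default used (Silverman's rule of thumb) is a standard textbook choice, picked here purely for illustration.

```python
import math
import random

def gaussian_kde(sample, x, bandwidth=None):
    """Evaluate a Gaussian kernel density estimate at point x."""
    n = len(sample)
    if bandwidth is None:
        mean = sum(sample) / n
        sd = math.sqrt(sum((s - mean) ** 2 for s in sample) / (n - 1))
        bandwidth = 1.06 * sd * n ** (-1 / 5)  # Silverman's rule of thumb
    return sum(
        math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in sample
    ) / (n * bandwidth * math.sqrt(2 * math.pi))

data = [random.gauss(0, 1) for _ in range(500)]
# Density near the mode should be close to the standard normal's ~0.40.
print(round(gaussian_kde(data, 0.0), 3))
```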
Scientific Data Analysis using Jython Scripting and Java presents practical approaches for data analysis using Java scripting based on Jython, a Java implementation of the Python language. The chapters essentially cover all aspects of data analysis, from arrays and histograms to clustering analysis, curve fitting, metadata and neural networks. A comprehensive coverage of data visualisation tools implemented in Java is also included. Written by the primary developer of the jHepWork data-analysis framework, the book provides a reliable and complete reference source laying the foundation for data-analysis applications using Java scripting. More than 250 code snippets (of around 10-20 lines each) written in Jython and Java, plus several real-life examples help the reader develop a genuine feeling for data analysis techniques and their programming implementation. This is the first data-analysis and data-mining book which is completely based on the Jython language, and opens doors to scripting using a fully multi-platform and multi-threaded approach. Graduate students and researchers will benefit from the information presented in this book.
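The book's premise, Python-syntax scripting over Java libraries, can be illustrated with a few lines that run under Jython, where Java classes import like Python modules. The sketch uses only the Java standard library, not the jHepWork classes the book itself builds on.

```python
# Runs under Jython: Java classes import directly into Python syntax.
from java.util import ArrayList, Random

rng = Random(42)           # a java.util.Random instance
values = ArrayList()       # a java.util.ArrayList instance
for _ in range(1000):
    values.add(rng.nextGaussian())

# Ordinary Python idioms then work on the Java collection.
mean = sum(values) / values.size()
print("mean of %d samples: %.4f" % (values.size(), mean))
```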
This is an edited volume, written by well-recognized international researchers, with extended chapter-style versions of the best papers presented at the SITIS 2006 International Conference. The book presents the state of the art and recent research results on the application of advanced signal processing techniques for improving the value of image and video data. It introduces new results on video coding and on the time-honored topic of securing image information. The book is designed for a professional audience composed of practitioners and researchers in industry, and is also suitable for advanced-level students in computer science.
Although recognized as a key to the design process, prototyping often falls victim to budget cuts, deadlines, or lack of access to sophisticated tools. This can lead to sloppy and ineffective prototypes, or the abandonment of them altogether. Rather than lose this important step, people are turning to Microsoft Excel® to create effective, simple, and inexpensive prototypes. Conveniently, the software is available to nearly everyone, and most are proficient in its basic functionality.
This book presents the R software environment as a key tool for oceanographic computations and provides a rationale for using R over the more widely-used tools of the field such as MATLAB. Kelley provides a general introduction to R before introducing the 'oce' package. This package greatly simplifies oceanographic analysis by handling the details of discipline-specific file formats, calculations, and plots. Designed for real-world application and developed with open-source protocols, oce supports a broad range of practical work. Generic functions take care of general operations such as subsetting and plotting data, while specialized functions address more specific tasks such as tidal decomposition, hydrographic analysis, and ADCP coordinate transformation. In addition, the package makes it easy to document work, because its functions automatically update processing logs stored within its data objects. Kelley teaches key R functions using classic examples from the history of oceanography, specifically the work of Alfred Redfield, Gordon Riley, J. Tuzo Wilson, and Walter Munk. Acknowledging the pervasive popularity of MATLAB, the book provides advice to users who would like to switch to R. Including a suite of real-life applications and over 100 exercises and solutions, the treatment is ideal for oceanographers, technicians, and students who want to add R to their list of tools for oceanographic analysis.
Disaster management is a process or strategy that is implemented when any type of catastrophic event takes place. The process may be initiated when anything threatens to disrupt normal operations or puts the lives of human beings at risk. Governments at all levels, as well as many businesses, create some sort of disaster plan that makes it possible to overcome the catastrophe and return to normal function as quickly as possible. Response to natural disasters (e.g., floods, earthquakes) or technological disasters (e.g., nuclear, chemical) is an extremely complex process that involves severe time pressure, various uncertainties, high non-linearity, and many stakeholders. Disaster management often requires several autonomous agencies to collaboratively mitigate, prepare for, respond to, and recover from heterogeneous and dynamic sets of hazards to society. Almost all disasters involve a high degree of novelty, forcing responders to deal with unexpected uncertainties and dynamic time pressures. Existing studies and approaches within disaster management have mainly focused on specific types of disaster from the perspective of particular agencies. A general framework is lacking that addresses the similarities and synergies among different disasters while taking their specific features into account. This book provides various decision analysis theories and support tools for complex systems in general and for disaster management in particular. The book also grew out of the long-term preparation of a European project proposal among leading experts in the areas related to its title. Chapters were evaluated on quality and originality in theory and methodology, application orientation, and relevance to the title of the book.
Throughout the world, high-profile large organizations (aerospace and defense, automotive, banking, chemicals, financial service providers, healthcare, high tech, insurance, oil and gas, pharmaceuticals, retail, telecommunications, and utilities) and governments are using SAP software to process their most mission-critical, highly sensitive data. With more than 100,000 installations, SAP is the world's largest enterprise software company and the world's third largest independent software supplier overall.
This book provides a complete and comprehensive guide to Pyomo (Python Optimization Modeling Objects) for beginning and advanced modelers, including students at the undergraduate and graduate levels, academic researchers, and practitioners. Using many examples to illustrate the different techniques useful for formulating models, this text beautifully elucidates the breadth of modeling capabilities that are supported by Pyomo and its handling of complex real-world applications. In the third edition, much of the material has been reorganized, new examples have been added, and a new chapter has been added describing how modelers can improve the performance of their models. The authors have also modified their recommended method for importing Pyomo. A big change in this edition is the emphasis on concrete models, which provide fewer restrictions on the specification and use of Pyomo models. Pyomo is an open source software package for formulating and solving large-scale optimization problems. The software extends the modeling approach supported by modern AML (Algebraic Modeling Language) tools. Pyomo is a flexible, extensible, and portable AML that is embedded in Python, a full-featured scripting language. Python is a powerful and dynamic programming language that has a very clear, readable syntax and intuitive object orientation. Pyomo includes Python classes for defining sparse sets, parameters, and variables, which can be used to formulate algebraic expressions that define objectives and constraints. Moreover, Pyomo can be used from a command-line interface and within Python's interactive command environment, which makes it easy to create Pyomo models, apply a variety of optimizers, and examine solutions.
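As a taste of the concrete-model style this edition emphasizes, a minimal Pyomo model looks roughly like the sketch below; the choice of the GLPK solver is an assumption, and any installed LP solver would do.

```python
# A minimal concrete Pyomo model: maximize 3x + 2y subject to simple limits.
from pyomo.environ import (ConcreteModel, Var, Objective, Constraint,
                           NonNegativeReals, maximize, SolverFactory)

model = ConcreteModel()
model.x = Var(domain=NonNegativeReals)
model.y = Var(domain=NonNegativeReals)
model.profit = Objective(expr=3 * model.x + 2 * model.y, sense=maximize)
model.capacity = Constraint(expr=model.x + model.y <= 10)
model.x_limit = Constraint(expr=model.x <= 6)

# Assumes the GLPK solver is installed and on the PATH.
SolverFactory("glpk").solve(model)
print(model.x(), model.y())  # expected optimum: x = 6, y = 4
```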
This book examines air pollution in a big city using multi-year and multi-season data from ground-based air monitoring stations and satellite sounding data, which together provide clearer and more detailed information on the main sources of air pollution, the long-term trend of pollution, the influence of meteorological parameters on pollution levels, and the trajectories of polluted air masses. For example, the book shows how particulate matter transported from deserts combines with local sources to create air quality challenges. It also analyzes the effects of desert and semi-desert landscapes on high concentrations of pollutants.
This book presents the textile, mathematical, and mechanical background for the modelling of fiber-based structures such as yarns and braided and knitted textiles. The hierarchical scales of these textiles and the structural elements at the different levels are analysed, and the methods for their modelling are presented. The author reports on problems, methods, algorithms, and possible solutions drawn from his twenty years of experience in modelling and in the development of CAD software for textiles.
Over the last decades, Discrete Event Simulation has conquered many different application areas. This trend is driven, on the one hand, by an ever wider use of this technology in different fields of science, and on the other hand by an incredibly creative use of available software programs by dedicated experts. This book contains articles from scientists and experts from 10 countries. They illuminate the breadth of application of this technology and the quality of problems solved using Discrete Event Simulation. Practical applications of simulation dominate in the present book. The book is aimed at researchers and students who work with Discrete Event Simulation and want to stay informed about current applications. By focusing on discrete event simulation, it can also serve as a source of inspiration for practitioners seeking to solve specific problems in their work. For decision makers who face the question of introducing discrete event simulation for planning support and optimization, this book provides orientation on what specific problems could be solved with the help of Discrete Event Simulation within the organization.
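For readers new to the technique, the core of Discrete Event Simulation is just a time-ordered event queue. The toy single-server queue below is a generic illustration of that loop, not an example from the book, and its arrival and service rates are arbitrary.

```python
import heapq
import random

# Toy Discrete Event Simulation of a single-server (M/M/1-style) queue.
# The event queue is a heap of (time, kind) pairs popped in time order.
random.seed(1)
ARRIVAL_RATE, SERVICE_RATE = 1.0, 1.25

events, t = [], 0.0
for _ in range(1000):                      # schedule Poisson arrivals
    t += random.expovariate(ARRIVAL_RATE)
    heapq.heappush(events, (t, "arrival"))

busy_until, waits = 0.0, []
while events:
    now, kind = heapq.heappop(events)
    if kind == "arrival":
        start = max(now, busy_until)       # queue behind the busy server
        waits.append(start - now)
        busy_until = start + random.expovariate(SERVICE_RATE)
        heapq.heappush(events, (busy_until, "departure"))
    # "departure" events need no extra bookkeeping in this minimal sketch

print("mean wait: %.2f over %d customers" % (sum(waits) / len(waits), len(waits)))
```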
Systems for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) are currently separate. The potential of the latest technologies and changes in operational and analytical applications over the last decade have given rise to the unification of these systems, which can be of benefit for both workloads. Research and industry have reacted and prototypes of hybrid database systems are now appearing. Benchmarks are the standard method for evaluating, comparing and supporting the development of new database systems. Because of the separation of OLTP and OLAP systems, existing benchmarks are only focused on one or the other. With the rise of hybrid database systems, benchmarks to assess these systems will be needed as well. Based on the examination of existing benchmarks, a new benchmark for hybrid database systems is introduced in this book. It is furthermore used to determine the effect of adding OLAP to an OLTP workload and is applied to analyze the impact of typically used optimizations in the historically separate OLTP and OLAP domains in mixed-workload scenarios.
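The question the benchmark answers, namely what adding OLAP does to an OLTP workload, can be made tangible with a toy measurement. The sketch below uses SQLite purely as a stand-in engine and a made-up orders table; it is in no way the benchmark defined in the book, which targets real hybrid systems.

```python
import sqlite3
import time
import random

# Toy mixed-workload measurement: interleave OLTP point updates with an
# OLAP-style aggregate scan and compare OLTP throughput.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [(i, random.uniform(1, 500)) for i in range(100_000)])

def run(oltp_ops, olap_every=0):
    start = time.perf_counter()
    for n in range(oltp_ops):
        db.execute("UPDATE orders SET amount = amount + 1 WHERE id = ?",
                   (random.randrange(100_000),))
        if olap_every and n % olap_every == 0:
            db.execute("SELECT AVG(amount), COUNT(*) FROM orders").fetchone()
    return oltp_ops / (time.perf_counter() - start)

print("OLTP alone : %.0f updates/s" % run(5000))
print("OLTP + OLAP: %.0f updates/s" % run(5000, olap_every=100))
```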
The methodological system known as The NuneX Method, named after its developer, Richard Nunez, took over 14 years of experience, documentation, and experimentation to develop and refine into a workable documentation system. The system can handle the influx of progress and change within information technology and be used as a form of technical knowledge management. Its main objectives are to enable any technical professional to properly document a project, system implementation, work request, or repair, and to maintain a personal reference library for professional growth. It can even serve as a gauge of the success an IT professional achieves as one improves and becomes more aware and open to new ideas and techniques. Use of The NuneX Method can certainly contribute to an IT professional's own personal success story and be a tool to use anytime and anywhere. It was developed by an IT professional for IT professionals, namely those who work in technical and engineering-level positions within Information Technology. The method's later stages cover refinement and the maintenance and updating of documentation, allowing for better quality service and professional advancement within a technical career, and the book comes complete with practical, real-world exercises to enhance the learning process.
You may like...
- Essential Java for Scientists and… by Brian Hahn, Katherine Malan (Paperback, R1,341, Discovery Miles 13 410)
- 29th European Symposium on Computer… by Anton A Kiss, Edwin Zondervan, … (Hardcover, R12,034, Discovery Miles 120 340)
- Fundamentals of Spatial Information… by Robert Laurini, Derek Thompson (Hardcover, R1,539, Discovery Miles 15 390)
- 14th International Symposium on Process… by Yoshiyuki Yamashita, Manabu Kano (Hardcover, R11,801, Discovery Miles 118 010)
- Financial Analysis With Microsoft Excel by Timothy Mayes (Paperback)
- Auroboros: Coils of the Serpent… by Warchief Gaming, Chris Metzen (Hardcover)