This work is the first and only book on the fundamentals of ultrasonic machining. It presents the foundations of the dynamics and control of ultrasonic processing systems and considers ultrasonic systems as special vibratory machines that function by exploiting nonlinear dynamic processes. Recommendations are given for designing and tuning ultrasonic machines. The ultrasonic machines analyzed are predominantly concerned with the processing of solids.
Computer Networks, Architecture and Applications covers many aspects of research in modern communications networks for computing purposes.
The Nonuniform Discrete Fourier Transform and its Applications in Signal Processing is organized into seven chapters. Chapter 1 introduces the problem of computing frequency samples of the z-transform of a finite-length sequence, and reviews the existing techniques. Chapter 2 develops the basics of the NDFT including its definition, properties and computational aspects. The NDFT is also extended to two dimensions. The ideas introduced here are utilized to develop applications of the NDFT in the following four chapters. Chapter 3 proposes a nonuniform frequency sampling technique for designing 1-D FIR digital filters. Design examples are presented for various types of filters. Chapter 4 utilizes the idea of the 2-D NDFT to design nonseparable 2-D FIR filters of various types. The resulting filters are compared with those designed by other existing methods and the performances of some of these filters are investigated by applying them to the decimation of digital images. Chapter 5 develops a design technique for synthesizing antenna patterns with nulls placed at desired angles to cancel interfering signals coming from these directions. Chapter 6 addresses the application of the NDFT in decoding dual-tone multi-frequency (DTMF) signals and presents an efficient decoding algorithm based on the subband NDFT (SB-NDFT), which achieves a fast, approximate computation of the NDFT. Concluding remarks are included in Chapter 7. The Nonuniform Discrete Fourier Transform and its Applications in Signal Processing serves as an excellent reference for researchers.
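At its core, the NDFT developed in Chapter 2 evaluates the z-transform of a finite-length sequence at arbitrarily placed points. A minimal numpy sketch (a generic illustration, not code from the book) is:

```python
import numpy as np

def ndft(x, z):
    """Evaluate the z-transform of the finite-length sequence x at the
    arbitrary points z: X(z_k) = sum_n x[n] * z_k**(-n). When the z_k
    are the N-th roots of unity this reduces to the ordinary DFT."""
    n = np.arange(len(x))
    return np.array([np.sum(x * zk ** (-n)) for zk in z])

# Sanity check against the uniform case: on equally spaced points of
# the unit circle the NDFT must agree with numpy's FFT.
x = np.array([1.0, 2.0, 3.0, 4.0])
z_uniform = np.exp(2j * np.pi * np.arange(4) / 4)
print(np.allclose(ndft(x, z_uniform), np.fft.fft(x)))  # True
```

The freedom to place the points z_k off the uniform grid is what the later chapters exploit, e.g. concentrating samples near DTMF tone frequencies.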
Volume II covers antenna theory and design, describing a number of antenna types, including receiving, wire and loop, horn, frequency-independent, microstrip, reflector, and lens antennas. This section also includes arrays, providing array theory as well as exploring waveguide-fed slot arrays, periodic arrays, and aperiodic arrays.
The feasibility of extracting porous-medium parameters from acoustic recordings is investigated. The thesis gives an excellent discussion of our basic understanding of the different wave modes, using a full-waveform and multi-component approach. The focus is on the dependence on porosity and permeability, the latter being especially difficult to estimate. In this thesis, this sensitivity is shown for interface-wave and reflected-wave modes. For each of the pseudo-Rayleigh and pseudo-Stoneley interface waves, unique estimates for permeability and porosity can be obtained when impedance and attenuation are combined.
An exciting new development has taken place in the digital era that has captured the imagination and talent of researchers around the globe - wavelet image compression. This technology has deep roots in theories of vision, and promises performance improvements over all other compression methods, such as those based on Fourier transforms, vector quantizers, fractals, neural nets, and many others. It is this revolutionary new technology that is presented in Wavelet Image and Video Compression, in a form that is accessible to the largest audience possible. Wavelet Image and Video Compression is divided into four parts. Part I, Background Material, introduces the basic mathematical structures that underlie image compression algorithms with the intention of providing an easy introduction to the mathematical concepts that are prerequisites for the remainder of the book. It explains such topics as change of bases, scalar and vector quantization, bit allocation and rate-distortion theory, entropy coding, the discrete-cosine transform, wavelet filters and other related topics. Part II, Still Image Coding, presents a spectrum of wavelet still image coding techniques. Part III, Special Topics in Still Image Coding, provides a variety of example coding schemes with a special flavor in either approach or application domain. Part IV, Video Coding, examines wavelet and pyramidal coding techniques for video data. Wavelet Image and Video Compression serves as an excellent reference and may be used as a text for advanced courses covering the subject.
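The transform-plus-quantization pipeline that Part I introduces can be illustrated with a one-level 1-D Haar wavelet transform followed by a crude thresholding step. This is a minimal sketch under simplified assumptions, not an algorithm taken from the book:

```python
import numpy as np

def haar_1d(x):
    """One level of the 1-D Haar wavelet transform: averages carry the
    coarse shape, differences carry the fine detail."""
    avg = (x[0::2] + x[1::2]) / np.sqrt(2)
    det = (x[0::2] - x[1::2]) / np.sqrt(2)
    return avg, det

def inverse_haar_1d(avg, det):
    x = np.empty(2 * len(avg))
    x[0::2] = (avg + det) / np.sqrt(2)
    x[1::2] = (avg - det) / np.sqrt(2)
    return x

# Compression by discarding small detail coefficients: for a smooth
# signal most of the energy sits in the averages, so the error stays
# below the threshold used to zero the details.
x = np.linspace(0.0, 1.0, 16) ** 2
avg, det = haar_1d(x)
det[np.abs(det) < 0.05] = 0.0        # crude stand-in for quantization
x_hat = inverse_haar_1d(avg, det)
print(np.max(np.abs(x - x_hat)) < 0.05)  # True
```

Real wavelet coders cascade several such levels and follow them with bit allocation and entropy coding, but the keep-the-large-coefficients principle is the same.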
This book constitutes the first single-volume, English-language treatise on electromagnetic wave propagation across the frequency spectrum.
The first part aims at providing the physical and theoretical framework for the analysis of density variations in fully turbulent flows. Its scope is deliberately educational.
One of the most intriguing questions in image processing is the problem of recovering the desired or perfect image from a degraded version. In many instances one has the feeling that the degradations in the image are such that relevant information is close to being recognizable, if only the image could be sharpened just a little. This monograph discusses the two essential steps by which this can be achieved, namely the topics of image identification and restoration. More specifically, the goal of image identification is to estimate the properties of the imperfect imaging system (blur) from the observed degraded image, together with some (statistical) characteristics of the noise and the original (uncorrupted) image. On the basis of these properties the image restoration process computes an estimate of the original image. Although there are many textbooks addressing the image identification and restoration problem in a general image processing setting, there are hardly any texts which give an in-depth treatment of the state of the art in this field. This monograph discusses iterative procedures for identifying and restoring images which have been degraded by a linear spatially invariant blur and additive white observation noise. As opposed to non-iterative methods, iterative schemes are able to solve the image restoration problem when formulated as a constrained and spatially variant optimization problem. In this way restoration results can be obtained which outperform the results of conventional restoration filters.
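The iterative restoration idea can be sketched with the classical Landweber iteration for a known spatially invariant blur. This is an illustrative, noise-free 1-D example, not the monograph's own scheme, which adds constraints and noise handling:

```python
import numpy as np

def landweber_restore(g, h, beta=1.0, iters=500):
    """Restore a signal g degraded by a known, linear, spatially
    invariant blur h (circular convolution) with the Landweber
    iteration f <- f + beta * H^T (g - H f), applied per frequency
    via the FFT."""
    H = np.fft.fft(h, len(g))
    f = np.zeros_like(g)
    for _ in range(iters):
        residual = g - np.real(np.fft.ifft(H * np.fft.fft(f)))
        f = f + beta * np.real(np.fft.ifft(np.conj(H) * np.fft.fft(residual)))
    return f

# Blur a sparse test signal, then recover it (noise-free case).
h = np.array([0.6, 0.3, 0.1])
f_true = np.zeros(32)
f_true[10], f_true[20] = 1.0, -0.5
g = np.real(np.fft.ifft(np.fft.fft(h, 32) * np.fft.fft(f_true)))
f_hat = landweber_restore(g, h)
print(np.max(np.abs(f_hat - f_true)) < 1e-3)  # True
```

With observation noise, the iteration must be stopped early or constrained, which is precisely where the identification step (estimating the blur and the noise statistics) becomes essential.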
This volume collects the papers from the 2013 World Conference on Acoustic Emission in Shanghai. The latest research and applications of Acoustic Emission (AE) are explored, with particular emphasis on the detection and processing of AE signals, the development of AE instruments and testing standards, and the AE of materials, engineering structures and systems, including the processing of collected data and analytical techniques as well as experimental case studies.
The Fourier transform is one of the most important mathematical tools in a wide variety of science and engineering fields. Its application - as Fourier analysis or harmonic analysis - provides useful decompositions of signals into fundamental ('primitive') components, giving shortcuts in the computation of complicated sums and integrals, and often revealing hidden structure in the data. Fourier Transforms: An Introduction for Engineers develops the basic definitions, properties and applications of Fourier analysis, the emphasis being on techniques for its application to linear systems, although other applications are also considered. The book will serve as both a reference text and a teaching text for a one-quarter or one-semester course covering the application of Fourier analysis to a wide variety of signals, including discrete time (or parameter), continuous time (or parameter), finite duration, and infinite duration. It highlights the common aspects in all cases considered, thereby building an intuition from simple examples that will be useful in the more complicated examples where careful proofs are not included. Fourier Transforms: An Introduction for Engineers is written by two scholars who are recognized throughout the world as leaders in this area, and provides a fresh look at one of the most important mathematical and directly applicable concepts in nearly all fields of science and engineering. Audience: Engineers, especially electrical engineers. The careful treatment of the fundamental mathematical ideas makes the book suitable in all areas where Fourier analysis finds applications.
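The decomposition into 'primitive' components can be seen in a few lines of numpy (a generic illustration, not taken from the book). When each sinusoid completes a whole number of cycles over the record, the DFT recovers its frequency and amplitude exactly:

```python
import numpy as np

# A signal built from two sinusoidal components.
N = 64
n = np.arange(N)
x = 3.0 * np.cos(2 * np.pi * 5 * n / N) + 1.0 * np.sin(2 * np.pi * 12 * n / N)

mag = np.abs(np.fft.fft(x)) / (N / 2)  # scaled so peaks read as amplitudes
peaks = np.nonzero(mag > 0.5)[0]
print(list(peaks))                     # [5, 12, 52, 59]: each tone and its mirror bin
print(round(mag[5], 3), round(mag[12], 3))  # 3.0 1.0
```

For a real signal the spectrum is conjugate-symmetric, which is why each tone appears twice (bin k and bin N-k).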
In the last quarter century, delamination has come to mean more than just a failure in adhesion between layers of bonded composite plies that might affect their load-bearing capacity. Ever-increasing computer power has meant that we can now detect and analyze delamination between, for example, cell walls in solid wood. This fast-moving and critically important field of study is covered in a book that provides everyone from manufacturers to research scientists with the state of the art in wood delamination studies. Divided into three sections, the book first details the general aspects of the subject, from basic information including terminology, to the theoretical basis for the evaluation of delamination. A settled terminology in this subject area is a first key goal of the book, as the terms which describe delamination in wood and wood-based composites are numerous and often confusing. The second section examines different and highly specialized methods for delamination detection such as confocal laser scanning microscopy, light microscopy, scanning electron microscopy and ultrasonics. Ways in which NDE (non-destructive evaluation) can be employed to detect and locate defects are also covered. The book's final section focuses on the practical aspects of this defect in a wide range of wood products covering the spectrum from trees, logs, laminated panels and glued laminated timbers to parquet floors. Intended as a primary reference, this book covers everything from the microscopic, anatomical level of delamination within solid wood sections to an examination of the interface of wood and its surface coatings. It provides readers with the perspective of industry as well as laboratory and is thus a highly practical sourcebook for wood engineers working in manufacturing as well as a comprehensively referenced text for materials scientists wrestling with the theory underlying the subject.
The book will appeal to both professional engineers and students and researchers in the subject. From an introduction to the basic terminology and underlying techniques, the book moves on to demonstrate the core enabling technologies, with a broad and balanced perspective given for each topic. Subsequent chapters focus on the applications and give an insight into the process of integrating a range of speech technologies for commercial solutions to customer needs. The book concludes with a speculative review of options for the future.
This book presents applications to several fluid dynamics problems in both bounded and unbounded domains in the framework of the discrete velocity models of kinetic theory. New models for dense gases, gases with multiple components, and gases with chemical reactions are also proposed. This is an up-to-date book on the applications of the discrete Boltzmann equation.
List of figures:
1.1. Steps in the initial auditory processing.
2 THE TIME-FREQUENCY ENERGY REPRESENTATION
2.1. Short-time spectrum of a steady-state /i/.
2.2. Smoothed short-time spectra.
2.3. Short-time spectra of linear chirps.
2.4. Short-time spectra of /w/'s.
2.5. Wide-band spectrograms of /w/'s.
2.6. Spectrograms of rapid formant motion.
2.7. Wigner distribution and spectrogram.
2.8. Wigner distribution and spectrogram of cos ω0t.
2.9. Concentration ellipses for transform kernels.
2.10. Concentration ellipses for complementary kernels.
2.11. Directional transforms for a linear chirp.
2.12. Spectrograms of /wioi/ with different window sizes.
2.13. Wigner distribution of /wioi/.
2.14. Time-frequency autocorrelation function of /wioi/.
2.15. Gaussian transform of /wioi/.
2.16. Directional transforms of /wioi/.
3 TIME-FREQUENCY FILTERING
3.1. Recovering the transfer function by filtering.
3.2. Estimating 'aliased' transfer function.
3.3. T-F autocorrelation function of an impulse train.
3.4. T-F autocorrelation function of LTI filter output.
3.5. Windowing recovers transfer function.
3.6. Shearing the time-frequency autocorrelation function.
3.7. T-F autocorrelation function for FM filter.
3.8. T-F autocorrelation function of FM filter output.
3.9. Windowing recovers transfer function.
4 THE SCHEMATIC SPECTROGRAM
Problems with pole-fitting approach.
This book concerns a new method of image data compression which well may supplant the well-established block-transform methods that have been state of the art for the last 15 years. Subband image coding, or SBC, was first performed as such in 1985, and as the results became known, at first through conference proceedings and later through journal papers, the research community became excited about both the theoretical and practical aspects of this new approach. This excitement is continuing today, with many major research laboratories and research universities around the world investigating the subband approach to coding of color images, high-resolution images, video (including video conferencing and advanced television), and the medical application of picture archiving systems. Much of the fruit of this work is summarized in the eight chapters of this book, which were written by leading practitioners in this field. The subband approach to image coding starts by passing the image through a two- or three-dimensional filter bank. The two-dimensional (2-D) case is usually hierarchical, consisting of two stages of four filters each. Thus the original image is split into 16 subband images, with each one decimated or subsampled by 4x4, resulting in a data conservation. The individual channel data is then quantized for digital transmission. In an attractive variation, an octave-like approach, herein termed subband pyramid, is taken for the decomposition, resulting in a total of just eleven subbands.
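The split-and-decimate step described above can be sketched with a single separable Haar stage; this is an illustrative stand-in for the book's filter banks, stopping at one four-filter level rather than the two stages (16 subbands) the text describes:

```python
import numpy as np

def haar_split(img):
    """One separable 2-D Haar analysis stage: the image is split into
    four subbands (low-low, low-high, high-low, high-high), each
    decimated 2x2 so the total sample count is conserved. Applying a
    second such stage to every subband would yield 16 subbands."""
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    ll = (a + b + c + d) / 2
    lh = (a - b + c - d) / 2
    hl = (a + b - c - d) / 2
    hh = (a - b - c + d) / 2
    return ll, lh, hl, hh

img = np.arange(64, dtype=float).reshape(8, 8)
ll, lh, hl, hh = haar_split(img)
# Data conservation: four 4x4 subbands replace one 8x8 image, and the
# split is invertible (here we recover the top-left polyphase samples).
print(ll.size + lh.size + hl.size + hh.size == img.size)      # True
print(np.allclose((ll + lh + hl + hh) / 2, img[0::2, 0::2]))  # True
```

The subband-pyramid variation mentioned at the end instead re-splits only the low-low band at each level, which is why it ends up with eleven subbands rather than sixteen.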
This book provides an up-to-date introduction to the theory of sound propagation in the ocean. The text treats both ray and wave propagation and pays considerable attention to stochastic problems such as the scattering of sound at rough surfaces and random inhomogeneities. An introductory chapter that discusses the basic experimental data complements the following theoretical chapters. New material has been added throughout for this third edition. New topics covered include: inter-thermocline lenses and their effect on sound fields; weakly divergent bundles of rays; ocean acoustic tomography; coupled modes; sound scattering by anisotropic volume inhomogeneities with fractal spectra; and Voronovich's approach to sound scattering from the rough sea surface. In addition, the list of references has been brought up to date and the latest experimental data have been included.
Rate-Quality Optimized Video Coding discusses the matter of optimizing (or negotiating) the data rate of compressed digital video and its quality, which has been a relatively neglected topic on either side of image/video coding and tele-traffic management. Video rate management becomes a technically challenging task since it is required to maintain a certain video quality regardless of the availability of transmission or storage media. This is caused by the broadband nature of digital video and inherent algorithmic features of mainstream video compression schemes, e.g. H.261, H.263 and the MPEG series. In order to maximize the media utilization and to enhance video quality, the data rate of compressed video should be regulated within a budget of available media resources while maintaining the video quality as high as possible. In Part I (Chapters 1 to 4) the non-stationarity of digital video is discussed. Since the non-stationary nature is also inherited from algorithmic properties of international video coding standards, which are a combination of statistical coding techniques, the video rate management techniques of these standards are explored. Although there is a series of known video rate control techniques, such as picture rate variation, frame dropping, etc., these techniques do not view the matter as an optimization between rate and quality. From the view of rate-quality optimization, the quantizer is the sole means of controlling rate and quality. Thus, quantizers and quantizer control techniques are analyzed, based on the relationship of rate and quality. In Part II (Chapters 5 and 6), as a coherent approach to non-stationary video, established but still thriving nonlinear techniques are applied to video rate-quality optimization, such as artificial neural networks including radial basis function networks, and fuzzy logic-based schemes. Conventional linear techniques are also described before the nonlinear techniques are explored.
By using these nonlinear techniques, it is shown how they tackle the rate-quality optimization problem. Finally, in Chapter 7 rate-quality optimization issues are reviewed in emerging video communication applications such as video transcoding and mobile video. This chapter discusses some new issues and prospects of rate and quality control in those technology areas. Rate-Quality Optimized Video Coding is an excellent reference and can be used for advanced courses on the topic.
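The claim that the quantizer is the sole means of controlling rate and quality can be illustrated with a plain uniform quantizer; this is a generic sketch with synthetic data, not an example from the book:

```python
import numpy as np

def quantize(x, step):
    """Uniform midtread quantizer: in the standard coders discussed
    here, the step size is the single knob trading rate for quality."""
    return step * np.round(x / step)

# Synthetic stand-in for transform coefficients: a coarser step needs
# fewer levels (lower rate) but leaves more distortion.
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)

results = []
for step in (0.1, 0.5, 2.0):
    y = quantize(x, step)
    mse = float(np.mean((x - y) ** 2))
    levels = int(np.unique(y).size)  # crude proxy for the bit rate
    results.append((step, levels, mse))
    print(step, levels, round(mse, 4))
```

Rate control in a real coder amounts to choosing this step per block or per frame so that the resulting bitstream fits the channel budget, which is exactly the optimization the book formalizes.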
In multimedia and communication environments all documents must be protected against attacks. The movie Forrest Gump showed how multimedia documents can be manipulated. The required security can be achieved by a number of different security measures. This book provides an overview of the current research in Multimedia and Communication Security. A broad variety of subjects are addressed including: network security; attacks; cryptographic techniques; healthcare and telemedicine; security infrastructures; payment systems; access control; models and policies; auditing and firewalls. This volume contains the selected proceedings of the joint conference on Communications and Multimedia Security, organized by the International Federation for Information Processing and supported by the Austrian Computer Society, Gesellschaft fuer Informatik e.V. and TeleTrust Deutschland e.V. The conference took place in Essen, Germany, in September 1996.
Speech Recognition has a long history of being one of the difficult problems in Artificial Intelligence and Computer Science. As one goes from problem solving tasks such as puzzles and chess to perceptual tasks such as speech and vision, the problem characteristics change dramatically: knowledge-poor to knowledge-rich; low data rates to high data rates; slow response time (minutes to hours) to instantaneous response time. These characteristics taken together increase the computational complexity of the problem by several orders of magnitude. Further, speech provides a challenging task domain which embodies many of the requirements of intelligent behavior: operate in real time; exploit vast amounts of knowledge; tolerate errorful, unexpected, and unknown input; use symbols and abstractions; communicate in natural language; and learn from the environment. Voice input to computers offers a number of advantages. It provides a natural, fast, hands-free, eyes-free, location-free input medium. However, there are many as yet unsolved problems that prevent routine use of speech as an input device by non-experts. These include cost, real-time response, speaker independence, robustness to variations such as noise, microphone, speech rate and loudness, and the ability to handle non-grammatical speech. Satisfactory solutions to each of these problems can be expected within the next decade. Recognition of unrestricted spontaneous continuous speech appears unsolvable at present. However, by the addition of simple constraints, such as clarification dialog to resolve ambiguity, we believe it will be possible to develop systems capable of accepting very large vocabulary continuous speech dictation.
Client/Server applications are of increasing importance in industry, and have been improved by advanced distributed object-oriented techniques, dedicated tool support and both multimedia and mobile computing extensions. Recent responses to this trend are standardized distributed platforms and models including the Distributed Computing Environment (DCE) of the Open Software Foundation (OSF), Open Distributed Processing (ODP), and the Common Object Request Broker Architecture (CORBA) of the Object Management Group (OMG). These proceedings are the compilation of papers from the technical stream of the IFIP/IEEE International Conference on Distributed Platforms, Dresden, Germany. This conference has been sponsored by IFIP TC6.1, by the IEEE Communications Society, and by the German Association of Computer Science (GI - Gesellschaft fuer Informatik). ICDP'96 was organized jointly by Dresden University of Technology and Aachen University of Technology. It is closely related to the International Workshop on OSF DCE in Karlsruhe, 1993, and to the IFIP International Conference on Open Distributed Processing. ICDP has been designed to bring together researchers and practitioners who are studying and developing new methodologies, tools and technologies for advanced client/server environments, distributed systems, and network applications based on distributed platforms.
Mobile computing is one of the biggest issues of computer technology, science and industry today. This book looks at the requirements of developing mobile computing systems and the challenges they pose to computer designers. It examines the requirements of mobile computing hardware, infrastructure and communications services. Information security and the data protection aspects of design are considered, together with telecommunications facilities for linking up to the worldwide computer infrastructure. The book also considers the mobility of computer users versus the portability of the equipment. The text also examines current applications of mobile computing in the public sector and future innovative applications.
These proceedings present the results of the 29th International Symposium on Shock Waves (ISSW29) which was held in Madison, Wisconsin, U.S.A., from July 14 to July 19, 2013. It was organized by the Wisconsin Shock Tube Laboratory, which is part of the College of Engineering of the University of Wisconsin-Madison. The ISSW29 focused on the following areas: Blast Waves, Chemically Reactive Flows, Detonation and Combustion, Facilities, Flow Visualization, Hypersonic Flow, Ignition, Impact and Compaction, Industrial Applications, Magnetohydrodynamics, Medical and Biological Applications, Nozzle Flow, Numerical Methods, Plasmas, Propulsion, Richtmyer-Meshkov Instability, Shock-Boundary Layer Interaction, Shock Propagation and Reflection, Shock Vortex Interaction, Shock Waves in Condensed Matter, Shock Waves in Multiphase Flow, as well as Shock Waves in Rarefied Flow. The two volumes contain the papers presented at the symposium and serve as a reference for the participants of the ISSW29 and individuals interested in these fields.
Most fluid flows of practical importance are fully three-dimensional, so the non-linear instability properties of three-dimensional flows are of particular interest. In some cases the three-dimensionality may have been caused by a finite amplitude disturbance whilst, more usually, the unperturbed state is three-dimensional. Practical applications where transition is thought to be associated with non-linearity in a three-dimensional flow arise, for example, in aerodynamics (swept wings, engine nacelles, etc.), turbines and aortic blood flow. Here inviscid 'cross-flow' disturbances as well as Tollmien-Schlichting and Goertler vortices can all occur simultaneously and their mutual non-linear behaviour must be understood if transition is to be predicted. The non-linear interactions are so complex that usually fully numerical or combined asymptotic/numerical methods must be used. Moreover, in view of the complexity of the instability processes, there is also a growing need for detailed and accurate experimental information. Carefully conducted tests allow us to identify those elements of a particular problem which are dominant. This assists in both the formulation of a relevant theoretical problem and the subsequent physical validation of predictions. It should be noted that the demands made upon the skills of the experimentalist are high and that the tests can be extremely sophisticated - often making use of the latest developments in flow diagnostic techniques, automated high speed data gathering, data analysis, fast processing and presentation.