1.1. Steps in the initial auditory processing.
2 THE TIME-FREQUENCY ENERGY REPRESENTATION
2.1. Short-time spectrum of a steady-state /i/.
2.2. Smoothed short-time spectra.
2.3. Short-time spectra of linear chirps.
2.4. Short-time spectra of /w/'s.
2.5. Wide band spectrograms of /w/'s.
2.6. Spectrograms of rapid formant motion.
2.7. Wigner distribution and spectrogram.
2.8. Wigner distribution and spectrogram of cos ω0t.
2.9. Concentration ellipses for transform kernels.
2.10. Concentration ellipses for complementary kernels.
2.11. Directional transforms for a linear chirp.
2.12. Spectrograms of /wioi/ with different window sizes.
2.13. Wigner distribution of /wioi/.
2.14. Time-frequency autocorrelation function of /wioi/.
2.15. Gaussian transform of /wioi/.
2.16. Directional transforms of /wioi/.
3 TIME-FREQUENCY FILTERING
3.1. Recovering the transfer function by filtering.
3.2. Estimating 'aliased' transfer function.
3.3. T-F autocorrelation function of an impulse train.
3.4. T-F autocorrelation function of LTI filter output.
3.5. Windowing recovers transfer function.
3.6. Shearing the time-frequency autocorrelation function.
3.7. T-F autocorrelation function for FM filter.
3.8. T-F autocorrelation function of FM filter output.
3.9. Windowing recovers transfer function.
4 THE SCHEMATIC SPECTROGRAM
Problems with pole-fitting approach.
This book provides an up-to-date introduction to the theory of sound propagation in the ocean. The text treats both ray and wave propagation and pays considerable attention to stochastic problems such as the scattering of sound at rough surfaces and random inhomogeneities. An introductory chapter that discusses the basic experimental data complements the following theoretical chapters. New material has been added throughout for this third edition. New topics covered include: inter-thermocline lenses and their effect on sound fields; weakly divergent bundles of rays; ocean acoustic tomography; coupled modes; sound scattering by anisotropic volume inhomogeneities with fractal spectra; and Voronovich's approach to sound scattering from the rough sea surface. In addition, the list of references has been brought up to date and the latest experimental data have been included.
This book concerns a new method of image data compression which well may supplant the well-established block-transform methods that have been state-of-the-art for the last 15 years. Subband image coding, or SBC, was first performed as such in 1985, and as the results became known, at first through conference proceedings and later through journal papers, the research community became excited about both the theoretical and practical aspects of this new approach. This excitement is continuing today, with many major research laboratories and research universities around the world investigating the subband approach to coding of color images, high resolution images, video (including video conferencing and advanced television), and the medical application of picture archiving systems. Much of the fruit of this work is summarized in the eight chapters of this book, which were written by leading practitioners in this field. The subband approach to image coding starts by passing the image through a two- or three-dimensional filter bank. The two-dimensional (2-D) case is usually hierarchical, consisting of two stages of four filters each. Thus the original image is split into 16 subband images, with each one decimated or subsampled by 4x4, so that the total amount of data is conserved. The individual channel data is then quantized for digital transmission. In an attractive variation, an octave-like approach, herein termed a subband pyramid, is taken for the decomposition, resulting in a total of just eleven subbands.
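As a rough illustration of the filter-bank step described above (a minimal sketch, not code from the book; the two-tap Haar-like filters stand in for the longer subband filters used in practice), one analysis stage splits an image into four decimated subbands, and applying the same split to each of the four outputs yields the 16 subbands:

```python
import numpy as np

def split_1d(x):
    # Two-tap analysis pair: low-pass = pairwise averages, high-pass = pairwise
    # differences, each decimated by 2 (the simplest quadrature mirror pair).
    x = x[: len(x) // 2 * 2]
    lo = (x[0::2] + x[1::2]) / np.sqrt(2)
    hi = (x[0::2] - x[1::2]) / np.sqrt(2)
    return lo, hi

def subband_split_2d(img):
    # Filter and decimate along rows, then along columns, giving the four
    # subbands LL, LH, HL, HH. Repeating this on each output gives 16 subbands,
    # each subsampled by 4x4, so the total number of samples is conserved.
    rows_lo, rows_hi = (np.array(b) for b in zip(*(split_1d(r) for r in img)))

    def split_cols(m):
        lo, hi = (np.array(b).T for b in zip(*(split_1d(c) for c in m.T)))
        return lo, hi

    ll, lh = split_cols(rows_lo)
    hl, hh = split_cols(rows_hi)
    return ll, lh, hl, hh

bands = subband_split_2d(np.random.rand(64, 64))
print([b.shape for b in bands])   # four 32x32 subbands
```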
These proceedings present the results of the 29th International Symposium on Shock Waves (ISSW29), which was held in Madison, Wisconsin, U.S.A., from July 14 to July 19, 2013. It was organized by the Wisconsin Shock Tube Laboratory, which is part of the College of Engineering of the University of Wisconsin-Madison. The ISSW29 focused on the following areas: Blast Waves, Chemically Reactive Flows, Detonation and Combustion, Facilities, Flow Visualization, Hypersonic Flow, Ignition, Impact and Compaction, Industrial Applications, Magnetohydrodynamics, Medical and Biological Applications, Nozzle Flow, Numerical Methods, Plasmas, Propulsion, Richtmyer-Meshkov Instability, Shock-Boundary Layer Interaction, Shock Propagation and Reflection, Shock Vortex Interaction, Shock Waves in Condensed Matter, Shock Waves in Multiphase Flow, as well as Shock Waves in Rarefied Flow. The two volumes contain the papers presented at the symposium and serve as a reference for the participants of the ISSW29 and individuals interested in these fields.
This book presents all aspects of situational awareness using acoustic signals. It starts by presenting the science behind the understanding and interpretation of sound signals. The book then goes on to provide the various signal processing techniques used in acoustics to find the direction of a sound source, localize gunfire, track vehicles and detect people. The necessary mathematical background and various classification and fusion techniques are presented. The book brings together in one place most of what is needed to process acoustic signals for all aspects of situational awareness. The book also presents array theory, which is pivotal in finding the direction of arrival of acoustic signals. In addition, the book presents techniques to fuse the information from multiple homogeneous/heterogeneous sensors for better detection. MATLAB code is provided for the majority of the real applications; it is a valuable resource not only for understanding the theory, but readers can also use the code as a springboard to develop their own application software.
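The book's own MATLAB code is not reproduced here, but as a minimal sketch of the array-theory idea it describes (estimating direction of arrival from the inter-sensor time delay; all names, geometry and constants below are illustrative assumptions):

```python
import numpy as np

def doa_from_mic_pair(x1, x2, fs, d, c=343.0):
    # Estimate the direction of arrival (degrees from broadside) for two
    # microphones spaced d metres apart: find the time difference of arrival
    # by cross-correlation, then invert the far-field delay relation
    # tdoa = d * sin(theta) / c. Single-source, far-field assumption.
    corr = np.correlate(x1, x2, mode="full")
    lag = np.argmax(corr) - (len(x2) - 1)          # samples by which x1 lags x2
    sin_theta = np.clip((lag / fs) * c / d, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))

# Synthetic check: delay one channel as a source at ~20 degrees would.
fs, d = 48_000, 0.20
delay = int(round(d * np.sin(np.radians(20.0)) / 343.0 * fs))
source = np.random.randn(fs)                       # 1 s of broadband source
print(doa_from_mic_pair(np.roll(source, delay), source, fs, d))  # roughly 20
```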
Rate-Quality Optimized Video Coding discusses the matter of optimizing (or negotiating) the data rate of compressed digital video against its quality, a topic that has been relatively neglected on both sides of image/video coding and tele-traffic management. Video rate management is a technically challenging task, since a certain video quality must be maintained regardless of the available transmission or storage capacity. This is caused by the broadband nature of digital video and by inherent algorithmic features of mainstream video compression schemes, e.g. H.261, H.263 and the MPEG series. In order to maximize media utilization and enhance video quality, the data rate of compressed video should be regulated within a budget of available media resources while keeping the video quality as high as possible. In Part I (Chapters 1 to 4) the non-stationarity of digital video is discussed. Since this non-stationary nature is also inherited from the algorithmic properties of the international video coding standards, which are a combination of statistical coding techniques, the video rate management techniques of these standards are explored. Although there is a series of known video rate control techniques, such as picture rate variation, frame dropping, etc., these techniques do not view the matter as an optimization between rate and quality. From the viewpoint of rate-quality optimization, the quantizer is the sole means of controlling rate and quality. Thus, quantizers and quantizer control techniques are analyzed, based on the relationship between rate and quality. In Part II (Chapters 5 and 6), as a coherent approach to non-stationary video, established but still thriving nonlinear techniques, such as artificial neural networks (including radial basis function networks) and fuzzy logic-based schemes, are applied to video rate-quality optimization. Conventional linear techniques are also described before the nonlinear techniques are explored. It is shown how these nonlinear techniques influence and tackle the rate-quality optimization problem. Finally, in Chapter 7 rate-quality optimization issues are reviewed in emerging video communication applications such as video transcoding and mobile video. This chapter discusses some new issues and prospects for rate and quality control in those technology areas. Rate-Quality Optimized Video Coding is an excellent reference and can be used for advanced courses on the topic.
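To make the role of the quantizer concrete, a toy buffer-feedback rule of the kind such rate control schemes build on might look like the sketch below (an illustrative assumption, not a scheme taken from the book; names and constants are made up):

```python
def next_quantizer_step(q, buffer_fullness, target=0.5, gain=4.0, q_min=1, q_max=31):
    """Raise the quantizer step when the encoder buffer fills above its target
    (rate too high, so quality is traded away); lower it when the buffer drains
    (spare rate is available, so quality can be increased)."""
    q_new = q + gain * (buffer_fullness - target)
    return max(q_min, min(q_max, q_new))

# Example: a half-full buffer leaves q unchanged; a nearly full buffer raises it.
print(next_quantizer_step(8, 0.5), next_quantizer_step(8, 0.9))
```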
Mobile computing is one of the biggest issues in computer technology, science and industry today. This book looks at the requirements of developing mobile computing systems and the challenges they pose to computer designers. It examines the requirements of mobile computing hardware, infrastructure and communications services. Information security and the data protection aspects of design are considered, together with telecommunications facilities for linking up to the worldwide computer infrastructure. The book also considers the mobility of computer users as distinct from the portability of the equipment, and examines current applications of mobile computing in the public sector as well as future innovative applications.
Speech Recognition has a long history of being one of the difficult problems in Artificial Intelligence and Computer Science. As one goes from problem solving tasks such as puzzles and chess to perceptual tasks such as speech and vision, the problem characteristics change dramatically: knowledge poor to knowledge rich; low data rates to high data rates; slow response time (minutes to hours) to instantaneous response time. These characteristics taken together increase the computational complexity of the problem by several orders of magnitude. Further, speech provides a challenging task domain which embodies many of the requirements of intelligent behavior: operate in real time; exploit vast amounts of knowledge; tolerate errorful, unexpected and unknown input; use symbols and abstractions; communicate in natural language; and learn from the environment. Voice input to computers offers a number of advantages. It provides a natural, fast, hands-free, eyes-free, location-free input medium. However, there are many as yet unsolved problems that prevent routine use of speech as an input device by non-experts. These include cost, real-time response, speaker independence, robustness to variations such as noise, microphone, speech rate and loudness, and the ability to handle non-grammatical speech. Satisfactory solutions to each of these problems can be expected within the next decade. Recognition of unrestricted spontaneous continuous speech appears unsolvable at present. However, by the addition of simple constraints, such as clarification dialog to resolve ambiguity, we believe it will be possible to develop systems capable of accepting very large vocabulary continuous speech dictation.
In multimedia and communication environments all documents must be protected against attacks. The movie Forrest Gump showed how multimedia documents can be manipulated. The required security can be achieved by a number of different security measures. This book provides an overview of the current research in Multimedia and Communication Security. A broad variety of subjects are addressed, including: network security; attacks; cryptographic techniques; healthcare and telemedicine; security infrastructures; payment systems; access control; models and policies; auditing and firewalls. This volume contains the selected proceedings of the joint conference on Communications and Multimedia Security, organized by the International Federation for Information Processing and supported by the Austrian Computer Society, Gesellschaft fuer Informatik e.V. and TeleTrust Deutschland e.V. The conference took place in Essen, Germany, in September 1996.
Speech coding has been an ongoing area of research for several decades, yet the level of activity and interest in this area has expanded dramatically in the last several years. Important advances in algorithmic techniques for speech coding have recently emerged and excellent progress has been achieved in producing high quality speech at bit rates as low as 4.8 kb/s. Although the complexity of the newer, more sophisticated algorithms greatly exceeds that of older methods (such as ADPCM), today's powerful programmable signal processor chips allow rapid technology transfer from research to product development and permit many new cost-effective applications of speech coding. In particular, low bit rate voice technology is converging with the needs of the rapidly evolving digital telecommunication networks. The IEEE Workshop on Speech Coding for Telecommunications was held in Vancouver, British Columbia, Canada, from September 5 to 8, 1989. The objective of the workshop was to provide a forum for discussion of recent developments and future directions in speech coding. The workshop attracted over 130 researchers from several countries and its technical program included 51 papers.
Client/Server applications are of increasing importance in industry, and have been improved by advanced distributed object-oriented techniques, dedicated tool support and both multimedia and mobile computing extensions. Recent responses to this trend are standardized distributed platforms and models including the Distributed Computing Environment (DCE) of the Open Software Foundation (OSF), Open Distributed Processing (ODP), and the Common Object Request Broker Architecture (CORBA) of the Object Management Group (OMG). These proceedings are the compilation of papers from the technical stream of the IFIP/IEEE International Conference on Distributed Platforms, Dresden, Germany. This conference was sponsored by IFIP TC6.1, by the IEEE Communications Society, and by the German Association of Computer Science (GI - Gesellschaft fuer Informatik). ICDP'96 was organized jointly by Dresden University of Technology and Aachen University of Technology. It is closely related to the International Workshop on OSF DCE in Karlsruhe, 1993, and to the IFIP International Conference on Open Distributed Processing. ICDP has been designed to bring together researchers and practitioners who are studying and developing new methodologies, tools and technologies for advanced client/server environments, distributed systems, and network applications based on distributed platforms.
Acoustic and elastic wave propagation is being investigated in media such as the ocean, the earth, biological tissues and solid materials. In these different areas, many specific imaging techniques have been developed which differ in the wavelength of the sound, its polarisation and the instrumentation used. In this interdisciplinary book, leading experts in underwater acoustics, seismology, acoustic medical imaging and non-destructive testing present basic concepts as well as the recent advances in imaging. The different subjects tackled show significant similarities. This volume gives an up-to-date overview of the field and is intended for scientists and graduates alike. Also available online at LINK: http://link.springer.de/series/tap/. Access to tables of contents and abstracts is free; subscribers have access to the full text in PDF format on requesting a password.
The inverse scattering problem is central to many areas of science and technology such as radar and sonar, medical imaging, geophysical exploration and nondestructive testing. This book is devoted to the mathematical and numerical analysis of the inverse scattering problem for acoustic and electromagnetic waves. In this third edition, new sections have been added on the linear sampling and factorization methods for solving the inverse scattering problem, as well as expanded treatments of iteration methods and uniqueness theorems for the inverse obstacle problem. These additions have in turn required an expanded presentation of both transmission eigenvalues and boundary integral equations in Sobolev spaces. As in the previous editions, emphasis has been given to simplicity over generality, thus providing the reader with an accessible introduction to the field of inverse scattering theory. Reviews of earlier editions: "Colton and Kress have written a scholarly, state of the art account of their view of direct and inverse scattering. The book is a pleasure to read as a graduate text or to dip into at leisure. It suggests a number of open problems and will be a source of inspiration for many years to come." SIAM Review, September 1994. "This book should be on the desk of any researcher, any student, any teacher interested in scattering theory." Mathematical Intelligencer, June 1994.
Most fluid flows of practical importance are fully three-dimensional, so the non-linear instability properties of three-dimensional flows are of particular interest. In some cases the three-dimensionality may have been caused by a finite amplitude disturbance whilst, more usually, the unperturbed state is three-dimensional. Practical applications where transition is thought to be associated with non-linearity in a three-dimensional flow arise, for example, in aerodynamics (swept wings, engine nacelles, etc.), turbines and aortic blood flow. Here inviscid 'cross-flow' disturbances as well as Tollmien-Schlichting and Görtler vortices can all occur simultaneously and their mutual non-linear behaviour must be understood if transition is to be predicted. The non-linear interactions are so complex that usually fully numerical or combined asymptotic/numerical methods must be used. Moreover, in view of the complexity of the instability processes, there is also a growing need for detailed and accurate experimental information. Carefully conducted tests allow us to identify those elements of a particular problem which are dominant. This assists in both the formulation of a relevant theoretical problem and the subsequent physical validation of predictions. It should be noted that the demands made upon the skills of the experimentalist are high and that the tests can be extremely sophisticated - often making use of the latest developments in flow diagnostic techniques, automated high speed data gathering, data analysis, fast processing and presentation.
What is "digital telephony"? To the authors, the term digital telephony denotes the technology used to provide a completely digital telecommunication system from end-to-end. This implies the use of digital technology from one end instru ment through transmission facilities and switching centers to another end instru ment. Digital telephony has become possible only because of the recent and on going surge of semiconductor developments, allowing microminiaturization and high reliability along with reduced costs. This book deals with both the future and the present. Thus, the first chapter is entitled, "A Network in Transition." As baselines, Chapters 2 and 11 provide the reader with the present status of teler-hone technology in terms of voice digiti zation as well as switching principles. The book is an outgrowth of the authors' consulting and teaching experience in the field since the early 1980s. The book has been written to provide both the engineering student and the practicing engineer a working knowledge of the prin ciples of present and future telecommunication systems based upon the use of the public switched network. Problems or discussion questions have been included at the ends of the chapters to facilitate the book's use as a senior-level or first year graduate-level course text. Numerous clients and associates of the authors as well as hundreds of others have provided useful information and examples for the text, and the authors wish to thank all those who have so contributed either directly or indirectly."
This book provides a comprehensive presentation of the conceptual basis of wavelet analysis, including the construction and analysis of wavelet bases. It motivates the central ideas of wavelet theory by offering a detailed exposition of the Haar series, then shows how a more abstract approach allows readers to generalize and improve upon the Haar series. It then presents a number of variations and extensions of the Haar construction.
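As a flavour of the Haar construction that the exposition starts from, a minimal multilevel Haar analysis might look like this (an illustrative sketch, not code from the book):

```python
import numpy as np

def haar_decompose(x, levels=3):
    # At each level, split the signal into scaled pairwise averages
    # (approximation) and differences (detail), then decompose the
    # averages again; the details are the Haar series coefficients.
    x = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        x = x[: len(x) // 2 * 2]
        avg = (x[0::2] + x[1::2]) / np.sqrt(2)
        dif = (x[0::2] - x[1::2]) / np.sqrt(2)
        details.append(dif)
        x = avg
    return x, details          # coarsest approximation plus detail coefficients

approx, details = haar_decompose(np.sin(np.linspace(0, 4 * np.pi, 64)))
print(approx.shape, [d.shape for d in details])    # (8,) [(32,), (16,), (8,)]
```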
This book focuses on serving human listeners in sound field synthesis, although the approach can also be exploited in other applications such as underwater acoustics or ultrasonics. The author derives a fundamental formulation based on standard integral equations, and the single-layer potential approach is identified as a useful tool for deriving a general solution. He also proposes extensions to the single-layer potential approach which allow for the derivation of explicit solutions for circular, planar, and linear distributions of secondary sources. Based on this formulation, it is shown that the two established analytical approaches of Wave Field Synthesis and Near-field Compensated Higher Order Ambisonics constitute specific solutions to the general problem which are covered by the single-layer potential solution and its extensions.
Coding and Modulation for Digital Television presents a comprehensive description of all error control coding and digital modulation techniques used in Digital Television (DTV). This book presents the relevant elements of the expansive theory of channel coding and shows how the transmission environment dictates the choice of error control coding and digital modulation schemes. These elements are presented in such a way that both mathematical integrity and engineering understanding are combined in a complete form, supported by a number of practical examples. In addition, the book contains descriptions of the existing standards and provides a valuable source of corresponding references. Coding and Modulation for Digital Television also features a description of the latest techniques, providing the reader with a glimpse of future digital broadcasting. These include the concepts of soft-in-soft-out decoding, turbo coding and cross-correlated quadrature modulation, all of which will have a prominent future in improving the efficiency of the next generation of DTV systems. Coding and Modulation for Digital Television is essential reading for all undergraduate and postgraduate students, broadcasting and communication engineers, researchers, marketing managers, regulatory bodies, governmental organizations and standardization institutions of the digital television industry.
This book describes the physics of the second-generation quartz crystal microbalance (QCM), a fundamental method of analysis for soft matter at interfaces. From a device for measuring film thickness in vacuum, the quartz crystal microbalance has over the past two decades evolved into a versatile instrument for analyzing soft matter at solid/liquid and solid/gas interfaces that has found applications in diverse fields, including the life sciences, materials science, polymer research and electrochemistry. As a consequence of this success, the QCM is now being used by scientists with a wide variety of backgrounds to study an impressive diversity of samples, with intricate data analysis methods being elaborated along the way. It is for these practitioners of the QCM that the book is written. It brings across the basic principles behind the technique and the data analysis methods in sufficient detail to be educational, and in a format that is accessible to anyone with an undergraduate-level knowledge of any of the physical or natural sciences. These principles concern the analysis of acoustic shear waves and build on a number of fundamental physical concepts which many users of the technique do not usually come across. They have counterparts in optical spectroscopy, electrical engineering, quantum mechanics, rheology and mechanics, making this book a useful educational resource beyond the QCM itself. The main focus is the physics of the QCM, but as the book describes the behavior of the QCM when exposed to films, droplets, polymer brushes, particles, vesicles, nanobubbles and stick-slip, it also offers insight into the behavior of soft matter at interfaces in a more general sense.
Welcome to the fourth IFIP workshop on protocols for high speed networks in Vancouver. This workshop follows three very successful workshops held in Zürich (1989), Palo Alto (1990) and Stockholm (1993) respectively. We received a large number of papers in response to our call for contributions. This year, forty papers were received, of which sixteen were presented as full papers and four were presented as poster papers. Although we received many excellent papers, the program committee decided to keep the number of full presentations low in order to accommodate more discussion, in keeping with the format of a workshop. Many people have contributed to the success of this workshop, including the members of the program committee who, with the additional reviewers, helped make the selection of the papers. We are thankful to all the authors of the papers that were submitted. We also thank the several organizations which contributed financially to this workshop, especially NSERC, ASI, CICSR, UBC, MPR Teltech and Newbridge Networks.
The use of various types of wave energy is an increasingly promising, non-destructive means of detecting objects and of diagnosing the properties of quite complicated materials. An analysis of this technique requires an understanding of how waves evolve in the medium of interest and how they are scattered by inhomogeneities in the medium. These scattering phenomena can be thought of as arising from some perturbation of a given, known system and they are analysed by developing a scattering theory. This monograph provides an introductory account of scattering phenomena and a guide to the technical requirements for investigating wave scattering problems. It gathers together the principal mathematical topics which are required when dealing with wave propagation and scattering problems, and indicates how to use the material to develop the required solutions. Both potential and target scattering phenomena are investigated and extensions of the theory to the electromagnetic and elastic fields are provided. Throughout, the emphasis is on concepts and results rather than on the fine detail of proof; a bibliography at the end of each chapter points the interested reader to more detailed proofs of the theorems and suggests directions for further reading. Aimed at graduate and postgraduate students and researchers in mathematics and the applied sciences, this book provides the newcomer to the field with a unified, and reasonably self-contained, introduction to an exciting research area and, for the more experienced reader, a source of information and techniques.
Waves represent a classic topic of study in physics, mathematics, and engineering. Many modern technologies are based on our understanding of waves and their interaction with matter. In the past thirty years there have been some revolutionary developments in the study of waves. The present volume is the only available source which details these developments in a systematic manner, with the aim of reaching a broad audience of non-experts. It is an important resource book for those interested in understanding the physics underlying nanotechnology and mesoscopic phenomena, as well as for bridging the gap between the textbooks and research frontiers in any wave related topic. A special feature of this volume is the treatment of classical and quantum mechanical waves within a unified framework, thus facilitating an understanding of similarities and differences between the two.
This book is a continuous learning tool for experienced technical staff facing laser vibrometry technology for the first time. The book covers both theoretical aspects and practical applications of laser Doppler vibrometry, and is accompanied by a multimedia presentation that allows the audience to browse the content and come as close as possible to performing real experiments. After a brief introduction, Chapter 2 presents supporting theory, providing general information on light sources, light scattering and interference for a better understanding of the rest of the book. Chapter 3 examines the theory of laser vibrometers, explaining interferometers from an optical perspective and in terms of the related electronics. It also addresses options like tracking filters and different signal demodulation strategies, since these have a significant impact on the practical use of vibrometers. Chapter 4 explores the configurations that are encountered in today's instrumentation, with a focus on providing practical suggestions on the use of laser vibrometers. Lastly, Chapter 5 investigates metrology for vibration and shock measurements using laser interferometry, and analyses the uncertainty of laser vibrometers in depth.
The need for automatic speech recognition systems to be robust with respect to changes in their acoustical environment has become more widely appreciated in recent years, as more systems are finding their way into practical applications. Although the issue of environmental robustness has received only a small fraction of the attention devoted to speaker independence, even speech recognition systems that are designed to be speaker independent frequently perform very poorly when they are tested using a different type of microphone or acoustical environment from the one with which they were trained. The use of microphones other than a "close talking" headset also tends to severely degrade speech recognition performance. Even in relatively quiet office environments, speech is degraded by additive noise from fans, slamming doors, and other conversations, as well as by the effects of unknown linear filtering arising from reverberation from surface reflections in a room, or spectral shaping by microphones or the vocal tracts of individual speakers. Speech recognition systems designed for long-distance telephone lines, or applications deployed in more adverse acoustical environments such as motor vehicles, factory floors, or outdoors, demand far greater degrees of environmental robustness. There are several different ways of building acoustical robustness into speech recognition systems. Arrays of microphones can be used to develop a directionally-sensitive system that resists interference from competing talkers and other noise sources that are spatially separated from the source of the desired speech signal.
A major advantage of a direct digital synthesizer is that its output frequency, phase and amplitude can be precisely and rapidly manipulated under digital processor control. This book was written to explore possible applications of this capability in radio communication systems.
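A minimal numeric sketch of the phase-accumulator principle behind a direct digital synthesizer (parameters, the exact-sine lookup, and all names below are illustrative assumptions, not material from the book):

```python
import numpy as np

def dds(ftw, n_samples, phase_offset=0.0, amplitude=1.0, acc_bits=32):
    # Each clock cycle the frequency tuning word (ftw) is added to a phase
    # accumulator; the accumulator value addresses a sine table (an exact sine
    # here for brevity). Output frequency = ftw * f_clock / 2**acc_bits, so
    # frequency, phase and amplitude are all set by digital words.
    acc = (np.arange(n_samples, dtype=np.uint64) * ftw) % (1 << acc_bits)
    phase = 2 * np.pi * acc / float(1 << acc_bits) + phase_offset
    return amplitude * np.sin(phase)

f_clock, acc_bits = 100e6, 32
ftw = int(round(1e6 / f_clock * 2**acc_bits))   # tuning word for a ~1 MHz output
y = dds(ftw, n_samples=1000)
```

Retuning the output is then simply a matter of writing a new tuning word, which is what makes the approach attractive for agile radio applications.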