The behavior of polymer solutions in simple shear flows has been the subject of considerable research in the past. Reports on polymers in elongational flow, by contrast, have appeared comparatively recently in the literature. Elongational flow, with its inherently low vorticity, is known to be more effective at extending polymer chains than simple shear flow, and is thus the more interesting case both for basic polymer science (molecular chain dynamics at high deformation) and for applied polymer science (rheology, fiber extrusion, drag reduction, flow through porous media). Undoubtedly, one landmark in the field of polymer dynamics in elongational flow was the notion of a critical strain rate for chain extension, initially put forward by A. Peterlin (1966) and later refined into the "coil-stretching" transition by P. G. de Gennes and E. J. Hinch (1974). In the two decades that followed, significant progress in the understanding of chain conformation in "strong" flow was accomplished through a combination of advances in instrumentation, computational techniques, and theoretical studies. Because of the multidisciplinary nature of the field, information on polymer chains in "strong" flow is accessible only from reviews and research papers scattered across disparate scientific journals. An important objective of this book is to remedy that situation by providing the reader with up-to-date knowledge in a single volume. The editors therefore invited leading specialists to provide both fundamental and applied information on the multiple facets of chain deformation in elongational flow.
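For orientation, the criterion these works introduced is usually summarized as follows; this is the standard textbook statement, sketched here for context rather than quoted from the volume:

```latex
% Coil-stretch criterion (standard form, stated for context):
% a flexible chain with longest relaxation time \tau unravels once the
% elongational strain rate \dot{\varepsilon} exceeds a critical value
\[
  \dot{\varepsilon}_c \sim \frac{1}{2\tau},
  \qquad\text{i.e. the transition occurs near } \dot{\varepsilon}\,\tau \approx \tfrac{1}{2},
\]
% whereas simple shear, whose vorticity rotates the chain out of the
% stretching axis, produces no comparably sharp transition.
```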
This proceedings volume contains 39 papers presented at the IUTAM Symposium on Variations of Domains and Free Boundary Problems in Solid Mechanics, held in Paris from April 22nd to 25th 1997, at the Ecole des Mines and the Ecole Polytechnique. This symposium offered an opportunity for researchers from all engineering disciplines and applied mathematics to review the state of the art and to identify new trends and new features in the field. Mechanical modelling, mathematical discussion and numerical resolution were the primary goals of the meeting. Principal subjects of discussion concerned ground freezing, shape memory alloys, crystal growth, phase change in solids, piezo-electricity, wavelets, delamination, damage, fracture mechanics, polymerization, adhesion, friction, porous media, nucleation, plasticity, inverse problems, and topological optimization. More than 80 scientists of different nationalities participated in this symposium. The efforts of many people made this symposium possible. We would like to thank all the authors and participants for their contributions, and the members of the Scientific Committee for their patronage and assistance in selecting papers. The effectiveness of the Organizing Committee is acknowledged. We are pleased to thank all the involved members of the two laboratories, the Laboratoire de Mecanique des Solides and the Laboratoire des Materiaux et des Structures du Genie Civil, especially Valerie Fran
Reliably optimizing a new treatment in humans is a critical first step in clinical evaluation since choosing a suboptimal dose or schedule may lead to failure in later trials. At the same time, if promising preclinical results do not translate into a real treatment advance, it is important to determine this quickly and terminate the clinical evaluation process to avoid wasting resources. Bayesian Designs for Phase I-II Clinical Trials describes how phase I-II designs can serve as a bridge or protective barrier between preclinical studies and large confirmatory clinical trials. It illustrates many of the severe drawbacks with conventional methods used for early-phase clinical trials and presents numerous Bayesian designs for human clinical trials of new experimental treatment regimes. Written by research leaders from the University of Texas MD Anderson Cancer Center, this book shows how Bayesian designs for early-phase clinical trials can explore, refine, and optimize new experimental treatments. It emphasizes the importance of basing decisions on both efficacy and toxicity.
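To make the efficacy-plus-toxicity idea concrete, here is a minimal sketch of the kind of Bayesian admissibility check such designs rest on; the priors, cutoffs, and interim data below are invented for illustration and are not taken from the book:

```python
from scipy.stats import beta

# Hypothetical interim data at one dose: 20 patients,
# 9 responses (efficacy) and 4 dose-limiting toxicities.
n, n_eff, n_tox = 20, 9, 4

# Beta(0.5, 0.5) priors; conjugacy gives Beta posteriors directly.
post_eff = beta(0.5 + n_eff, 0.5 + n - n_eff)
post_tox = beta(0.5 + n_tox, 0.5 + n - n_tox)

# Admissibility in the spirit of phase I-II designs: demand evidence
# that efficacy exceeds 30% AND toxicity stays below 35%
# (all thresholds here are illustrative, not the book's).
p_eff_ok = 1 - post_eff.cdf(0.30)   # Pr(p_eff > 0.30 | data)
p_tox_ok = post_tox.cdf(0.35)       # Pr(p_tox < 0.35 | data)
admissible = p_eff_ok > 0.10 and p_tox_ok > 0.10
print(f"Pr(eff>0.30)={p_eff_ok:.2f}, Pr(tox<0.35)={p_tox_ok:.2f}, "
      f"dose admissible: {admissible}")
```

The point of such two-sided rules is exactly the one the blurb emphasizes: a dose that is safe but inactive, or active but too toxic, is screened out early rather than carried into a confirmatory trial.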
Data compression is one of the main contributing factors in the explosive growth in information technology. Without it, a number of consumer and commercial products, such as DVD, videophone, digital camera, MP3, video-streaming and wireless PCS, would have been virtually impossible. Transforming the data to a frequency or other domain enables even more efficient compression. By illustrating this intimate link, The Transform and Data Compression Handbook serves as a much-needed handbook for a wide range of researchers and engineers.
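As a toy illustration of why moving to a frequency domain helps, the following Python sketch (assuming NumPy and SciPy; the test signal and the number of retained coefficients are arbitrary) keeps only 16 of 256 DCT coefficients and still reconstructs the signal closely:

```python
import numpy as np
from scipy.fft import dct, idct

# Toy transform-coding pipeline: DCT, keep the largest coefficients, invert.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256)
signal = np.sin(2 * np.pi * 3 * t) + 0.1 * rng.standard_normal(256)

coeffs = dct(signal, norm='ortho')
k = 16                                  # coefficients kept (16 of 256)
mask = np.zeros_like(coeffs)
idx = np.argsort(np.abs(coeffs))[-k:]   # indices of the k largest magnitudes
mask[idx] = coeffs[idx]

reconstructed = idct(mask, norm='ortho')
error = np.linalg.norm(signal - reconstructed) / np.linalg.norm(signal)
print(f"kept {k}/256 coefficients, relative error {error:.3f}")
```

Because the transform concentrates the signal's energy into a few coefficients, discarding the rest loses little information; this energy-compaction property is the "intimate link" between transforms and compression that the handbook develops.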
This book constitutes the proceedings of the 33rd Annual International Conference on the Theory and Applications of Cryptographic Techniques, EUROCRYPT 2014, held in Copenhagen, Denmark, in May 2014. The 38 full papers included in this volume were carefully reviewed and selected from 197 submissions. They deal with public key cryptanalysis, identity-based encryption, key derivation and quantum computing, secret-key analysis and implementations, obfuscation and multilinear maps, authenticated encryption, symmetric encryption, multi-party encryption, side-channel attacks, signatures and public-key encryption, functional encryption, foundations and multi-party computation.
This book constitutes the proceedings of the 32nd Annual International Conference on the Theory and Applications of Cryptographic Techniques, EUROCRYPT 2013, held in Athens, Greece, in May 2013. The 41 full papers included in this volume were carefully reviewed and selected from 201 submissions. They deal with cryptanalysis of hash functions, side-channel attacks, number theory, lattices, public key encryption, digital signatures, homomorphic cryptography, quantum cryptography, storage, tools, and secure computation.
Computational aspects of the geometry of numbers have been revolutionized by the Lenstra-Lenstra-Lovász lattice reduction algorithm (LLL), which has led to breakthroughs in fields as diverse as computer algebra, cryptology, and algorithmic number theory. After its publication in 1982, LLL was immediately recognized as one of the most important algorithmic achievements of the twentieth century, because of its broad applicability and apparent simplicity. Its popularity has kept growing since, as testified by the hundreds of citations of the original article and the ever more frequent use of LLL as a synonym for lattice reduction. As an unfortunate consequence of the pervasiveness of the LLL algorithm, researchers studying and applying it belong to diverse scientific communities and seldom meet. While discussing that particular issue with Damien Stehlé at the 7th Algorithmic Number Theory Symposium (ANTS VII), held in Berlin in July 2006, John Cremona accurately remarked that 2007 would be the 25th anniversary of LLL and that this deserved a meeting to celebrate the event. The year 2007 was also involved in another arithmetical story. In 2003 and 2005, Ali Akhavi, Fabien Laguillaumie, and Brigitte Vallée, with other colleagues, organized two workshops on cryptology and algorithms with a strong emphasis on lattice reduction: CAEN '03 and CAEN '05, CAEN denoting both the location and the content (Cryptologie et Algorithmique En Normandie). Very quickly after the ANTS conference, Ali Akhavi, Fabien Laguillaumie, and Brigitte Vallée were thus readily contacted and reacted very enthusiastically about organizing the LLL birthday conference. The organization committee was formed.
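As a hedged aside for readers meeting lattice reduction for the first time: the essential mechanism of LLL is already visible in dimension two, where it reduces to classical Lagrange-Gauss reduction. A minimal Python sketch (function name and example basis are illustrative):

```python
import numpy as np

def gauss_reduce(u, v):
    """Lagrange-Gauss reduction of a rank-2 integer lattice basis.

    Repeatedly subtracts the rounded projection of the longer vector
    onto the shorter one -- the 2-D ancestor of the size-reduction and
    swap steps that LLL performs in higher dimensions.
    """
    u, v = np.asarray(u, dtype=np.int64), np.asarray(v, dtype=np.int64)
    if u @ u > v @ v:
        u, v = v, u
    while True:
        m = round((u @ v) / (u @ u))   # integer multiple to subtract
        v = v - m * u
        if v @ v >= u @ u:
            return u, v                # basis is now reduced
        u, v = v, u

# Example: a skewed basis of Z^2 is reduced to short, nearly
# orthogonal vectors, e.g. ([-1, 0], [0, 1]) here.
print(gauss_reduce([1, 1], [3, 4]))
```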
This book constitutes the thoroughly refereed post-proceedings of the First International Conference on Cryptology in Vietnam, VIETCRYPT 2006, held in Hanoi, Vietnam, in September 2006. The 24 revised full papers presented together with 1 invited paper were carefully reviewed and selected from 78 submissions. The papers are organized in topical sections on signatures and lightweight cryptography, pairing-based cryptography, algorithmic number theory, ring signatures and group signatures, hash functions, cryptanalysis, key agreement and threshold cryptography, as well as public-key encryption.
The first theme concerns the plastic buckling of structures in the spirit of Hill's classical approach. Non-bifurcation and stability criteria are introduced, and post-bifurcation analysis is performed by the asymptotic development method in relation to Hutchinson's work. Some recent results on the generalized standard model are given and their connection to Hill's general formulation is presented. Instability phenomena of inelastic flow processes, such as strain localization and necking, are discussed. The second theme concerns stability and bifurcation problems in internally damaged or cracked solids. In brittle fracture or brittle damage, the evolution law of crack lengths or damage parameters is time-independent, as in plasticity, and leads to a similar mathematical description of the quasi-static evolution. Stability and non-bifurcation criteria in the sense of Hill can again be obtained from the discussion of the rate response.
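For reference, the non-bifurcation criterion "in the sense of Hill" invoked here is commonly written as an exclusion condition on pairs of admissible rate fields; the following is the standard textbook form, stated as background rather than quoted from this book:

```latex
% Hill's exclusion condition: bifurcation of the quasi-static evolution
% is excluded as long as, for every pair of admissible rate fields
% (\dot{\sigma}_1, \dot{\varepsilon}_1) and (\dot{\sigma}_2, \dot{\varepsilon}_2),
\[
  \int_{V} \bigl(\dot{\sigma}_{1}-\dot{\sigma}_{2}\bigr)
           : \bigl(\dot{\varepsilon}_{1}-\dot{\varepsilon}_{2}\bigr)\,\mathrm{d}V \;>\; 0,
\]
% with stability of the current state tied to the positivity of the
% second-order work \int_{V} \dot{\sigma} : \dot{\varepsilon}\,\mathrm{d}V.
```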
Although the problem of stability and bifurcation is well understood in Mechanics, very few treatises have been devoted to stability and bifurcation analysis in dissipative media, in particular with regard to present and fundamental problems in Solid Mechanics such as plasticity, fracture and contact mechanics. Stability and Nonlinear Solid Mechanics addresses this lack of material, and proposes to the reader not only a unified presentation of nonlinear problems in Solid Mechanics, but also a complete and unitary analysis of stability and bifurcation problems arising within this framework. Main themes include:
This book constitutes the refereed proceedings of the 20th International Conference on Information Security, ISC 2017, held in Ho Chi Minh City, Vietnam, in November 2017. The 25 revised full papers presented were carefully reviewed and selected from 97 submissions. The papers are organized in topical sections on symmetric cryptography, post-quantum cryptography, public-key cryptography, authentication, attacks, privacy, mobile security, software security, and network and system security.
This research is aimed at improving the state of the art of GPS algorithms, namely through the development of a closed-form positioning algorithm for a stand-alone user and of a novel differential GPS algorithm for a network of users. The stand-alone user GPS algorithm is a direct, closed-form, and efficient new position determination algorithm that exploits the closed-form solution of the GPS trilateration equations and works in the presence of pseudorange measurement noise for an arbitrary number of satellites in view. A two-step GPS position determination algorithm is derived which entails the solution of a linear regression and then updates that solution using one nonlinear measurement equation. In this algorithm, only two or three iterations are required, as opposed to the five iterations normally required by the standard Iterative Least Squares (ILS) algorithm currently in use. The mathematically derived stochastic model-based solution algorithm for the GPS pseudorange equations is also assessed and compared to the conventional ILS algorithm. Good estimation performance is achieved, even under high Geometric Dilution of Precision (GDOP) conditions. The novel differential GPS algorithm for a network of users developed in this research uses a Kinematic Differential Global Positioning System (KDGPS) approach. A network of mobile receivers is considered, one of which is designated the 'reference station' and has known position and velocity information at the beginning of the time interval being examined. The measurement situation at hand is properly modeled, and a centralized estimation algorithm processing several epochs of data is developed.
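For orientation, the conventional ILS baseline the new two-step algorithm is compared against linearizes the pseudorange equations around the current estimate and iterates a least-squares correction. A minimal Python sketch (assuming NumPy; the function name, tolerance, and iteration cap are illustrative, and the clock bias is expressed in metres):

```python
import numpy as np

def ils_position(sat_pos, pseudoranges, x0=None, tol=1e-4, max_iter=10):
    """Iterative Least Squares solver for GPS pseudorange equations.

    sat_pos:      (n, 3) ECEF satellite positions [m]
    pseudoranges: (n,) measured pseudoranges [m]
    Returns the estimated receiver position (3,) and clock bias [m].
    """
    n = len(pseudoranges)
    x = np.zeros(3) if x0 is None else np.asarray(x0, float)
    b = 0.0  # receiver clock bias in metres
    for _ in range(max_iter):
        ranges = np.linalg.norm(sat_pos - x, axis=1)
        residuals = pseudoranges - (ranges + b)
        # Jacobian: negated unit line-of-sight vectors, plus 1 for the bias
        H = np.hstack([-(sat_pos - x) / ranges[:, None], np.ones((n, 1))])
        delta, *_ = np.linalg.lstsq(H, residuals, rcond=None)
        x += delta[:3]
        b += delta[3]
        if np.linalg.norm(delta[:3]) < tol:
            break
    return x, b
```

Each pass solves the linearized system H·delta = residuals; the text's point is that the closed-form two-step algorithm reaches comparable accuracy in two or three such updates instead of roughly five.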
Generating innovations requires investment in research and development (R&D). Precisely because new technologies matter so much for firms' competitiveness, the R&D process should be particularly well protected. Yet a new trend has emerged in this respect: more and more firms are willing to carry out their R&D jointly, even with competitors, and of necessity to share their knowledge with one another. In parallel with the trend toward R&D cooperation, government involvement in funding collaborative research has increased. In the large industrialized countries, governments have begun to promote R&D cooperation massively, regardless of its questionable status under competition law. What motives can firms have for conducting R&D together with their competitors? And what interest does the state have in additionally supporting such R&D cooperation? Spillovers play an important role in the emergence of R&D cooperation. Various models are used to analyze the incentives for forming R&D cooperations as well as their effects on competition. In the practical part of this book, R&D cooperation is examined using a concrete example. The semiconductor industry lends itself particularly well as a case study because it is regarded as one of the most dynamic and research-intensive industries.
Diploma thesis from 2004 in the field of Business Administration - Corporate Management, Management, Organisation, grade: 1.3, Universität Hamburg (Wirtschaftswissenschaften), language: German, abstract: Introduction: Innovations are regarded as the basis for the long-term competitiveness of individual firms as well as of entire economies. Through new products and processes, the innovating firm gains decisive competitive advantages over its rivals. For an economy, innovations are a precondition for sustainable growth and thus for securing jobs and income. Generating innovations requires investment in research and development (R&D). Precisely because new technologies matter so much for firms' competitiveness, the R&D process should be particularly well protected; intuitively, sealing off one's own R&D activities from competitors would be the logical consequence. Yet over the past two decades a new trend has developed in this respect. Contrary to intuition, more and more firms are willing to carry out their R&D jointly, even with competitors, and to share their knowledge with one another. In his empirical study, Hagedoorn (2002) found that the number of R&D cooperations has risen drastically since the beginning of the 1980s. It is particularly striking that collaborative research is gaining ground above all in research-intensive industries such as the IT sector. This is surprising insofar as innovations play a far greater role for competitiveness in these industries than, for example, in sectors with low research intensity. One would expect new technologies in the IT sector to be shielded from competitors even more strongly than elsewhere. Another striking feature in this respect can also be observed in the economic policy of various countries. Fast paral