Welcome to Loot.co.za!
Showing 1 - 25 of 41 matches in All Departments
With the rapid development of China and India as new economic powers in global competition, an obvious question is whether these emerging economies are great opportunities or threats. Whilst answers are bound to differ depending on one's perspective, it is increasingly clear that more local firms, especially local entrepreneurs, from these emerging economies will play a more critical role in global competition by becoming challengers to global incumbents. Indeed, the fact that the majority of their populations are at the bottom of the pyramid, and thus cannot afford products designed for the developed markets, has made these emerging economies fertile ground for developing and applying disruptive innovations. A novel mix of key attributes distinctive from those of established technologies or business models, disruptive innovations are typically inferior, yet affordable and "good-enough" products or services, which originate in lower-end market segments, but later move up to compete with those provided by incumbent firms. This book sheds new light on disruptive innovations both from and for the bottom of the pyramid in China and India, from the point of view of local entrepreneurs and international firms seeking to operate their businesses there. It covers both the theoretical and practical implications of disruptive innovation using conceptual frameworks alongside detailed case studies, whilst also providing a comparison of conditions and strategic options in India and China. Further, unlike existing studies, this book focuses on the neglected perspective of local challengers as the primary players, and in doing so reveals the extent to which the future landscape of global competition may be shaped by disruptive innovation, as well as its capacity to make the world "flatter" and more sustainable. 
This unique book will be valuable to both scholars and practitioners interested in disruptive innovation and those working in the fields of Asian studies, international business, economics and globalization.
This book addresses important issues of speech processing and language learning in Chinese. It highlights the perception and production of speech in healthy and clinical populations, in both children and adults. It offers diverse perspectives and reviews of cutting-edge research from recent decades on how Chinese speech is processed and learned, and each chapter also discusses directions for future research. With these unique features and its broad coverage of topics, this book appeals not only to scholars and students who study speech perception in preverbal infants and in children and adults learning Chinese, but also to teachers with interests in the pedagogical applications of teaching Chinese as a Second Language.
This book opens by discussing the importance of privacy-preserving techniques, then provides a thorough overview of how privacy-preserving machine learning schemes have evolved over the last ten years. In response to the diversity of Internet services, data services based on machine learning are now available for applications ranging from risk assessment to image recognition. Given open access to datasets and not-fully-trusted environments, machine learning-based applications face enormous security and privacy risks. The book then presents studies conducted to address privacy issues, along with a series of proposed solutions for ensuring privacy protection in machine learning tasks involving multiple parties. In closing, it reviews state-of-the-art privacy-preserving techniques and examines the security threats they face.
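One widely used building block in privacy-preserving machine learning is differential privacy, in which calibrated noise is added before a statistic is released. A minimal illustrative sketch (not taken from the book; function names are our own) of the Laplace mechanism applied to a clipped mean:

```python
import math
import random

def laplace_noise(scale, rng):
    """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_mean(values, lower, upper, epsilon, seed=0):
    """Release the mean of `values` under epsilon-differential privacy.

    Each value is clipped to [lower, upper], so one record can change the
    mean by at most (upper - lower) / n -- that bound is the sensitivity
    used to calibrate the Laplace noise.
    """
    rng = random.Random(seed)
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / n
    sensitivity = (upper - lower) / n
    return true_mean + laplace_noise(sensitivity / epsilon, rng)
```

Smaller `epsilon` gives a stronger privacy guarantee at the cost of noisier output; multi-party settings like those the book surveys typically combine such mechanisms with cryptographic protocols.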
This book focuses on the competitive situation and policy outlook of China's provincial economies in the 13th Five-Year Plan period. It begins with a general evaluation report on the country's provincial comprehensive economic competitiveness, followed by analyses at the international, national, regional, industrial and enterprise levels. On the basis of domestic and international research findings, it further enriches our understanding of provincial competitiveness, analyzes the domestic and international situation, explores the new changes, norms, situations and challenges facing China's provincial economies in recent years, reveals the characteristics and relative differences of different provincial types, identifies their internal competitive strengths and weaknesses, and provides valuable theoretical content to guide decision-making.
This book provides a detailed and up-to-date analysis of the current and near-future domestic economic situation in China, based on the concept of the "New Normal", which was first proposed by Chinese President Xi Jinping and is now commonly used in discussions of China's economy. China's New Normal is the result of growing pressure on domestic resources, environmental restrictions, and an unstable international economic recovery; it is characterized by moderate economic growth, appropriate increases in commodity prices, stable new employment and an optimizing economic structure. The book argues that as China focuses on stability and quality in macro-control while enhancing reform and innovation, many contradictions and problems in economic operations are gradually being solved, thereby optimizing the economic structure. The book explores many aspects of China's economic development under the "New Normal" and offers analysis and policy suggestions for present economic trends.
The book shows how eastern and western perspectives and conceptions can be used to address recent topics lying at the crossroads between philosophy and cognitive science. It reports on new points of view and conceptions discussed during the International Conference on Philosophy and Cognitive Science (PCS2013), held at Sun Yat-sen University in Guangzhou, China, and the 2013 Workshop on Abductive Visual Cognition, which took place at KAIST in Daejeon, South Korea. The book emphasizes an ever-growing cultural exchange between academics and intellectuals coming from different fields. It juxtaposes research works investigating new facets of key issues between philosophy and cognitive science, such as the role of models and causal representations in science; the status of theoretical concepts and quantum principles; and abductive cognition, vision, and visualization in science from an eco-cognitive perspective. Further topics are ignorance immunization in reasoning; moral cognition, violence, and epistemology; and models and biomorphism. The book, which presents a unique and timely account of the current state of the art on various aspects of philosophy and cognitive science, is expected to inspire philosophers, cognitive scientists and social scientists, and to generate fruitful exchanges and collaboration among them.
The book addresses a number of recent topics at the crossroads of philosophy and cognitive science, taking advantage of both the western and the eastern perspectives and conceptions that emerged and were discussed at the PCS2011 Conference, recently held in Guangzhou. The ever-growing cultural exchange between academics and intellectuals belonging to different cultures is reflected in the juxtaposition of papers, which aim at investigating new facets of crucial problems in philosophy: the role of models in science and the fictional approach; chance-seeking dynamics and how affordances work; abductive cognition; visualization in science; the cognitive structure of scientific theories; scientific representation; mathematical representation in science; model-based reasoning; analogical reasoning; moral cognition; and cognitive niches and evolution.
Optimization techniques have been widely adopted to implement various data mining algorithms. In addition to the well-known Support Vector Machines (SVMs), which are based on quadratic programming, different versions of Multiple Criteria Programming (MCP) have been extensively used for data separation. Since optimization-based data mining methods differ from statistics, decision tree induction, and neural networks, their theoretical inspiration has attracted many researchers interested in algorithm development for data mining. Optimization Based Data Mining: Theory and Applications focuses mainly on MCP and SVMs, especially their recent theoretical progress and real-life applications in various fields. These include finance, web services, bio-informatics and petroleum engineering, which have triggered the interest of practitioners looking for new methods to improve the results of data mining for knowledge discovery. Most of the material in this book comes directly from the research and application activities that the authors' research group has conducted over the last ten years. Aimed at practitioners and graduates with a fundamental knowledge of data mining, it demonstrates the basic concepts and foundations of using optimization techniques to deal with data mining problems.
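To make the optimization-based framing concrete: a linear SVM separates two classes by minimizing a regularized hinge loss. The classical formulation solves this as a quadratic program, as the description notes; the sketch below (ours, not the book's code) instead uses plain subgradient descent on the same objective, purely to keep the example short and dependency-free:

```python
import random

def train_linear_svm(points, labels, lam=0.01, epochs=500, lr=0.1, seed=0):
    """Fit a 2-D linear SVM by subgradient descent on the hinge loss.

    Minimizes  lam/2 * ||w||^2 + mean(max(0, 1 - y * (w.x + b))),
    the same objective the standard quadratic-programming SVM solves.
    """
    rng = random.Random(seed)
    w = [0.0, 0.0]
    b = 0.0
    n = len(points)
    for _ in range(epochs):
        for i in rng.sample(range(n), n):  # one pass in random order
            (x1, x2), y = points[i], labels[i]
            margin = y * (w[0] * x1 + w[1] * x2 + b)
            # The regularizer's subgradient always applies...
            gw = [lam * w[0], lam * w[1]]
            gb = 0.0
            if margin < 1:  # ...the hinge term only when the margin is violated.
                gw[0] -= y * x1
                gw[1] -= y * x2
                gb -= y
            w = [w[0] - lr * gw[0], w[1] - lr * gw[1]]
            b -= lr * gb

    def predict(x):
        return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

    return predict
```

On linearly separable data the learned hyperplane classifies the training points correctly; MCP methods, by contrast, pose the separation as a multiple-criteria program rather than a single regularized loss.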
For a given meromorphic function f(z) and an arbitrary value a, Nevanlinna's value distribution theory, which can be derived from the well-known Poisson-Jensen formula, deals with relationships between the growth of the function and quantitative estimations of the roots of the equation f(z) - a = 0. In the 1920s, as an application of the celebrated Nevanlinna value distribution theory of meromorphic functions, R. Nevanlinna [188] himself proved that for two nonconstant meromorphic functions f, g and five distinct values a_i (i = 1, 2, 3, 4, 5) in the extended plane, if f^{-1}(a_i) = g^{-1}(a_i) IM (ignoring multiplicities) for i = 1, 2, 3, 4, 5, then f = g. Furthermore, if f^{-1}(a_i) = g^{-1}(a_i) CM (counting multiplicities) for i = 1, 2, 3 and 4, then f = L(g), where L denotes a suitable Möbius transformation. Then in the 1970s, F. Gross and C. C. Yang started to study similar but more general questions of two functions that share sets of values. For instance, they proved that if f and g are two nonconstant entire functions and S_1, S_2 and S_3 are three distinct finite sets such that f^{-1}(S_i) = g^{-1}(S_i) CM for i = 1, 2, 3, then f = g.
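For reference, the two classical Nevanlinna uniqueness results this description alludes to can be stated compactly in standard notation (IM = ignoring multiplicities, CM = counting multiplicities, L a Möbius transformation):

```latex
% Five-value theorem: values shared ignoring multiplicity force identity.
f^{-1}(a_i) = g^{-1}(a_i)\ \mathrm{IM}\quad (i = 1,\dots,5)
  \;\Longrightarrow\; f \equiv g
% Four-value theorem: values shared counting multiplicity force a
% Möbius relation.
f^{-1}(a_i) = g^{-1}(a_i)\ \mathrm{CM}\quad (i = 1,\dots,4)
  \;\Longrightarrow\; f = L(g)
```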
MCDM 2009, the 20th International Conference on Multiple-Criteria Decision Making, emerged as a global forum dedicated to the sharing of original research results and practical development experiences among researchers and application developers from different multiple-criteria decision making-related areas, such as multiple-criteria decision aiding; multiple-criteria classification, ranking, and sorting; multiple-objective continuous and combinatorial optimization; multiple-objective metaheuristics; multiple-criteria decision making and preference modeling; and fuzzy multiple-criteria decision making. The theme for MCDM 2009 was "New State of MCDM in the 21st Century." The conference sought solutions to challenging problems facing the development of multiple-criteria decision making, and aimed to shape future directions of research by promoting high-quality, novel and daring research findings. With the MCDM conference, these new challenges and tools can easily be shared with the multiple-criteria decision making community. The workshop program included nine workshops focusing on different topics among the new research challenges and initiatives of MCDM. We received more than 350 submissions for all the workshops, of which 121 were accepted, comprising 72 regular papers and 49 short papers. We would like to thank all the workshop organizers and the Program Committee for their excellent work in maintaining the conference's standing for high-quality papers.
This volume is based on papers presented at the international conference on Model-Based Reasoning in Science and Medicine, held in China in 2006. The presentations explore how scientific thinking uses models and explanatory reasoning to produce creative changes in theories and concepts. The contributions are written by researchers active in the area of creative reasoning in science and technology, and include the subject area's most recent results and achievements.
This book provides a state-of-the-art review of the acquisition of lexical and grammatical aspect in both first and second language acquisition. More specifically, it presents a comprehensive analysis of how child and adult speakers learn to mark aspect, an important subsystem of language that marks the temporal contour of events by means of inherent lexical meanings and/or grammatical morphology (in contrast to tense, which marks the temporal location of events with respect to past, present, and future). Readers from linguistics, psychology, language acquisition, language education, and cognitive science should all find this book a relevant and important text for their research and teaching.
This book discusses the design of rating schemes and risk aggregation for the risk matrix, a popular risk assessment tool in many fields. Although the risk matrix is usually treated as a qualitative tool, this book analyzes it from a quantitative perspective. The content belongs to the scope of risk management and, more specifically, to quick risk assessment. The book is suitable for researchers and practitioners involved in qualitative or quick risk assessment, and helps readers understand how to design more convincing risk assessment tools and perform more accurate risk assessment in an uncertain context.
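As background on what a risk-matrix rating scheme is, the generic sketch below (our illustration, not the book's scheme) scores a cell as likelihood × severity and bands the score into qualitative ratings; how such bands should be drawn is precisely the design question the book studies quantitatively:

```python
def risk_rating(likelihood, severity):
    """Map 1-5 likelihood and severity scores to a qualitative rating.

    A generic illustration of how one risk-matrix cell is rated:
    score = likelihood * severity, then banded into low/medium/high.
    The band thresholds here (6 and 15) are arbitrary example choices.
    """
    if not (1 <= likelihood <= 5 and 1 <= severity <= 5):
        raise ValueError("scores must be in 1..5")
    score = likelihood * severity
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"
```

Different threshold choices change which cells are flagged, which is why rating-scheme design matters for assessment accuracy.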
This book proposes a bank risk aggregation framework based on financial statements. Bank risk aggregation is of great importance for maintaining the stable operation of the banking industry and preventing financial crises. A major obstacle to bank risk management is data shortage, which causes many quantitative risk aggregation approaches to fail. Recently, to overcome the inaccurate total risk results caused by the shortage of risk data, some researchers have proposed a series of financial statements-based bank risk aggregation approaches. However, existing studies suffer from the low frequency and time lag of financial statement data, and usually ignore off-balance sheet business risk in bank risk aggregation. Thus, by reviewing the research progress in financial statements-based bank risk aggregation and remedying the drawbacks of existing methods, this book proposes a framework that makes full use of the information recorded in financial statements, including the income statement, on- and off-balance sheet assets, and textual risk disclosures. This solves the data shortage problem in bank risk aggregation to some extent and improves the reliability and rationality of aggregation results. The book not only advances the theoretical study of bank risk aggregation but also provides important support for capital allocation in the banking industry in practice, making it valuable to both bank managers and researchers of bank risk management.
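For readers unfamiliar with risk aggregation itself, one standard textbook approach (the variance-covariance method; this is general background, not necessarily the framework the book proposes) combines standalone risk amounts through a correlation matrix:

```python
import math

def aggregate_risk(risks, corr):
    """Aggregate standalone risk amounts with the variance-covariance method.

    total = sqrt( sum_i sum_j r_i * r_j * rho_ij ).  With rho_ij = 1
    everywhere this reduces to simple summation (perfect dependence);
    lower correlations yield a diversification benefit.
    """
    n = len(risks)
    total_var = 0.0
    for i in range(n):
        for j in range(n):
            total_var += risks[i] * risks[j] * corr[i][j]
    return math.sqrt(total_var)
```

Estimating the correlations is exactly where data shortage bites, which motivates drawing on richer financial-statement information as the book describes.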
Climate change mechanisms, impacts, risks, mitigation, adaptation, and governance are widely recognized as the biggest, most interconnected problem facing humanity. Big Data Mining for Climate Change addresses one of the fundamental issues facing scientists of climate and the environment: how to manage and analyse the vast amount of information available. Integrated and interdisciplinary big data mining approaches are emerging, partly with the help of the United Nations' big data climate challenge, and some are widely recommended as new approaches for climate change research. The book delivers a rich understanding of climate-related big data techniques and highlights how to navigate the huge amount of climate data and resources available using big data applications. It points to future directions and will boost big-data-driven research on modeling, diagnosing and predicting climate change and mitigating its impacts. The book focuses mainly on climate network models, deep learning techniques for climate dynamics, automated feature extraction of climate variability, and sparsification of big climate data. It also includes a revelatory exploration of a big-data-driven low-carbon economy and its management. Its content provides cutting-edge knowledge for scientists and advanced students studying climate change from various disciplines, including atmospheric, oceanic and environmental sciences; geography; ecology; energy; economics; management; engineering; and public policy.
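The climate network models mentioned here are typically built by treating grid points as nodes and linking those whose climate time series are strongly correlated. A minimal sketch of that construction (our illustration; the 0.8 threshold is an arbitrary example choice):

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def climate_network(series, threshold=0.8):
    """Link grid points whose time series are strongly correlated.

    Returns edges (i, j) with |corr| >= threshold -- the basic
    construction behind correlation-based climate network models.
    """
    edges = []
    for i in range(len(series)):
        for j in range(i + 1, len(series)):
            if abs(pearson(series[i], series[j])) >= threshold:
                edges.append((i, j))
    return edges
```

At real data scales the all-pairs correlation step is exactly where the sparsification and feature-extraction techniques the book covers become necessary.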