Showing 1 - 11 of 11 matches in All Departments

Switching Arc Phenomena in Transmission Voltage Level Vacuum Circuit Breakers (Paperback, 1st ed. 2021)
Zhiyuan Liu, Jianhua Wang, Yingsan Geng, Zhenxing Wang
R3,512 · Discovery Miles 35 120 · Ships in 10 - 15 working days

Vacuum circuit breakers are widely used in distribution power systems because of advantages such as being maintenance-free and eco-friendly. Today, most circuit breakers used at the transmission voltage level are SF6 circuit breakers, but SF6 is one of the six greenhouse gases defined in the Kyoto Protocol, so developing transmission voltage level vacuum circuit breakers can benefit the environment. The switching arc phenomena in these breakers are the key issues to explore. This book focuses on high-current vacuum arc phenomena at the transmission voltage level, especially anode spot phenomena, which significantly influence whether short-circuit current interruption succeeds or fails. It then addresses the dielectric recovery property during current interruption, explains how to determine the closing/opening displacement curve of transmission voltage level vacuum circuit breakers based on the vacuum arc phenomena, and shows how to determine key design parameters for vacuum interrupters and vacuum circuit breakers at the transmission voltage level. Finally, it addresses the most challenging issue for vacuum circuit breakers: capacitive switching in vacuum. The contents will benefit researchers and engineers in power engineering, especially those working on power circuit breakers and power switching technology.

Switching Arc Phenomena in Transmission Voltage Level Vacuum Circuit Breakers (Hardcover, 1st ed. 2021)
Zhiyuan Liu, Jianhua Wang, Yingsan Geng, Zhenxing Wang
R5,327 · Discovery Miles 53 270 · Ships in 10 - 15 working days

Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data - 17th China National Conference, CCL 2018, and 6th International Symposium, NLP-NABD 2018, Changsha, China, October 19-21, 2018, Proceedings (Paperback, 1st ed. 2018)
Maosong Sun, Ting Liu, Xiaojie Wang, Zhiyuan Liu, Yang Liu
R1,599 · Discovery Miles 15 990 · Ships in 10 - 15 working days

This book constitutes the proceedings of the 17th China National Conference on Computational Linguistics, CCL 2018, and the 6th International Symposium on Natural Language Processing Based on Naturally Annotated Big Data, NLP-NABD 2018, held in Changsha, China, in October 2018. The 33 full papers presented in this volume were carefully reviewed and selected from 84 submissions. They are organized in topical sections named: Semantics; machine translation; knowledge graph and information extraction; linguistic resource annotation and evaluation; information retrieval and question answering; text classification and summarization; social computing and sentiment analysis; and NLP applications.

Representation Learning for Natural Language Processing (2nd ed. 2023)
Zhiyuan Liu, Yan-Kai Lin, Maosong Sun
R1,743 · Discovery Miles 17 430 · Ships in 10 - 15 working days

This book provides an overview of recent advances in representation learning theory, algorithms, and applications for natural language processing (NLP), ranging from word embeddings to pre-trained language models. It is divided into four parts. Part I presents representation learning techniques for multiple language entries, including words, sentences, and documents, as well as pre-training techniques. Part II introduces representation techniques closely related to NLP, including graphs, cross-modal entries, and robustness. Part III introduces representation techniques for knowledge closely related to NLP, including entity-based world knowledge, sememe-based linguistic knowledge, legal domain knowledge, and biomedical domain knowledge. Lastly, Part IV discusses the remaining challenges and future research directions. The theories and algorithms presented can also benefit related domains such as machine learning, social network analysis, the Semantic Web, information retrieval, data mining, and computational biology. The book is intended for advanced undergraduate and graduate students, post-doctoral fellows, researchers, lecturers, and industrial engineers, as well as anyone interested in representation learning and natural language processing. Compared to the first edition, the second edition (1) provides a more detailed introduction to representation learning in Chapter 1; (2) adds four new chapters on pre-trained language models, robust representation learning, legal knowledge representation learning, and biomedical knowledge representation learning; (3) updates recent advances in representation learning in all chapters; and (4) corrects some errors in the first edition. The new content amounts to roughly 50% more than the first edition. This is an open access book.
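To make the blurb's span from word embeddings to pre-trained language models concrete, here is a minimal illustrative sketch (not taken from the book) that extracts contextual token representations from a pre-trained model via the Hugging Face Transformers library; the model name and the mean-pooling choice are arbitrary assumptions.

```python
# Illustrative sketch (not from the book): contextual word representations
# from a pre-trained language model. Model name and pooling are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentence = "Representation learning maps words to dense vectors."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One vector per (sub)word token; averaging gives a crude sentence vector.
token_vectors = outputs.last_hidden_state[0]   # shape: (num_tokens, 768)
sentence_vector = token_vectors.mean(dim=0)    # shape: (768,)
print(token_vectors.shape, sentence_vector.shape)
```

Unlike static word embeddings, each token here receives a vector that depends on its surrounding context.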

Representation Learning for Natural Language Processing (2nd ed. 2023)
Zhiyuan Liu, Yan-Kai Lin, Maosong Sun
R1,090 · Discovery Miles 10 900 · Ships in 12 - 17 working days

Network Embedding - Theories, Methods, and Applications (Paperback)
Cheng Yang, Zhiyuan Liu, Cunchao Tu, Chuan Shi, Maosong Sun
R1,814 · Discovery Miles 18 140 · Ships in 10 - 15 working days

The book presents network embedding (NE) theories and methods, covering settings such as heterogeneous graphs. Further, it introduces different applications of NE, such as recommendation and information diffusion prediction. Finally, it summarizes these methods and applications and looks ahead to future research directions.
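As a rough illustration of what a network embedding method does (not an excerpt from this book), the sketch below follows the DeepWalk idea: truncated random walks over a graph are treated as sentences and fed to a skip-gram model, yielding one low-dimensional vector per node. The toy graph, walk settings, and hyperparameters are assumptions.

```python
# Illustrative DeepWalk-style network embedding sketch (assumptions: toy
# graph, walk length/count, and Word2Vec hyperparameters).
import random
import networkx as nx
from gensim.models import Word2Vec

G = nx.karate_club_graph()  # small toy graph

def random_walk(graph, start, length=10):
    walk = [start]
    for _ in range(length - 1):
        neighbors = list(graph.neighbors(walk[-1]))
        if not neighbors:
            break
        walk.append(random.choice(neighbors))
    return [str(n) for n in walk]  # Word2Vec expects string tokens

walks = [random_walk(G, node) for node in G.nodes() for _ in range(20)]

# Skip-gram over the walks yields one low-dimensional vector per node.
model = Word2Vec(sentences=walks, vector_size=64, window=5,
                 min_count=0, sg=1, epochs=5, workers=1)
print(model.wv["0"].shape)  # embedding of node 0
```

Nodes that co-occur on many walks end up with similar vectors, which is what downstream tasks such as recommendation exploit.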

Representation Learning for Natural Language Processing (Paperback, 1st ed. 2020)
Zhiyuan Liu, Yan-Kai Lin, Maosong Sun
R1,450 · Discovery Miles 14 500 · Ships in 10 - 15 working days

This open access book provides an overview of recent advances in representation learning theory, algorithms, and applications for natural language processing (NLP). It is divided into three parts. Part I presents representation learning techniques for multiple language entries, including words, phrases, sentences, and documents. Part II introduces representation techniques for objects closely related to NLP, including entity-based world knowledge, sememe-based linguistic knowledge, networks, and cross-modal entries. Lastly, Part III provides open resources and tools for representation learning techniques and discusses the remaining challenges and future research directions. The theories and algorithms presented can also benefit related domains such as machine learning, social network analysis, the Semantic Web, information retrieval, data mining, and computational biology. This book is intended for advanced undergraduate and graduate students, post-doctoral fellows, researchers, lecturers, and industrial engineers, as well as anyone interested in representation learning and natural language processing.
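As a small self-contained illustration of the kind of word representation learning this blurb describes (not code from the book), the sketch below builds a word-context co-occurrence matrix over a toy corpus and factorizes it with SVD to obtain dense word vectors; the corpus, window size, and dimensionality are arbitrary assumptions.

```python
# Count-based word representations: co-occurrence matrix + truncated SVD.
# Corpus, window size, and embedding dimension are arbitrary assumptions.
import numpy as np

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat and a dog played",
]
tokens = [sentence.split() for sentence in corpus]
vocab = sorted({w for sent in tokens for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2 word window.
window = 2
cooc = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                cooc[index[w], index[sent[j]]] += 1.0

# Truncated SVD: rows of U * S give low-dimensional word embeddings.
U, S, _ = np.linalg.svd(cooc, full_matrices=False)
dim = 4
embeddings = U[:, :dim] * S[:dim]
print(embeddings[index["cat"]])
```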

Introduction to Graph Neural Networks (Paperback)
Zhiyuan Liu, Jie Zhou
R1,770 · Discovery Miles 17 700 · Ships in 10 - 15 working days

Graphs are useful data structures in complex real-life applications such as modeling physical systems, learning molecular fingerprints, controlling traffic networks, and recommending friends in social networks. However, these tasks require dealing with non-Euclidean graph data that contains rich relational information between elements and cannot be handled well by traditional deep learning models (e.g., convolutional neural networks (CNNs) or recurrent neural networks (RNNs)). Nodes in graphs usually carry useful feature information that most unsupervised representation learning methods (e.g., network embedding methods) cannot address well. Graph neural networks (GNNs) combine the feature information and the graph structure to learn better representations on graphs via feature propagation and aggregation. Due to their convincing performance and high interpretability, GNNs have recently become widely applied graph analysis tools. This book provides a comprehensive introduction to the basic concepts, models, and applications of graph neural networks. It starts with an introduction to the vanilla GNN model, then presents several variants such as graph convolutional networks, graph recurrent networks, graph attention networks, graph residual networks, and several general frameworks. Variants for different graph types and advanced training methods are also included. As for the applications of GNNs, the book categorizes them into structural, non-structural, and other scenarios, and then introduces several typical models for solving these tasks. Finally, the closing chapters provide GNN open resources and an outlook on several future directions.
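To ground the blurb's description of feature propagation and aggregation, here is a minimal numpy sketch (not taken from the book) of a single GCN-style layer: each node aggregates its neighbours' features (plus its own via a self-loop) under symmetric normalization and applies a learned linear map. The toy graph and random weights are assumptions.

```python
# One GCN-style propagation step on a toy graph: H' = ReLU(A_norm @ H @ W).
# The adjacency matrix, features, and weights below are assumptions.
import numpy as np

A = np.array([[0, 1, 1, 0],        # adjacency matrix of a 4-node toy graph
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.randn(4, 8)          # initial node features (4 nodes, 8 dims)
W = np.random.randn(8, 16)         # linear map (learned in practice, random here)

A_hat = A + np.eye(4)              # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt   # symmetric normalization

H_next = np.maximum(0, A_norm @ H @ W)     # propagate, aggregate, ReLU
print(H_next.shape)                        # (4, 16): updated node embeddings
```

Stacking several such layers lets information flow over multi-hop neighbourhoods, which is the shared idea behind the GNN variants the book surveys.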

Chinese Computational Linguistics - 18th China National Conference, CCL 2019, Kunming, China, October 18-20, 2019, Proceedings (Paperback, 1st ed. 2019)
Maosong Sun, Xuanjing Huang, Heng Ji, Zhiyuan Liu, Yang Liu
R1,696 · Discovery Miles 16 960 · Ships in 10 - 15 working days

This book constitutes the proceedings of the 18th China National Conference on Computational Linguistics, CCL 2019, held in Kunming, China, in October 2019. The 56 full papers presented in this volume were carefully reviewed and selected from 134 submissions. They were organized in topical sections named: linguistics and cognitive science, fundamental theory and methods of computational linguistics, information retrieval and question answering, text classification and summarization, knowledge graph and information extraction, machine translation and multilingual information processing, minority language processing, language resource and evaluation, social computing and sentiment analysis, NLP applications.

Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data - 15th China National Conference, CCL 2016, and 4th International Symposium, NLP-NABD 2016, Yantai, China, October 15-16, 2016, Proceedings (Paperback, 1st ed. 2016)
Maosong Sun, Xuanjing Huang, Hongfei Lin, Zhiyuan Liu, Yang Liu
R2,987 · Discovery Miles 29 870 · Ships in 10 - 15 working days

This book constitutes the proceedings of the 15th China National Conference on Computational Linguistics, CCL 2016, and the 4th International Symposium on Natural Language Processing Based on Naturally Annotated Big Data, NLP-NABD 2016, held in Yantai City, China, in October 2016. The 29 full papers and 8 short papers presented in this volume were carefully reviewed and selected from 85 submissions. They were organized in topical sections named: semantics; machine translation; multilinguality in NLP; knowledge graph and information extraction; linguistic resource annotation and evaluation; information retrieval and question answering; text classification and summarization; social computing and sentiment analysis; and NLP applications.

Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data - 14th China National Conference, CCL 2015 and Third International Symposium, NLP-NABD 2015, Guangzhou, China, November 13-14, 2015, Proceedings (Paperback, 1st ed. 2015)
Maosong Sun, Zhiyuan Liu, Min Zhang, Yang Liu
R2,876 · Discovery Miles 28 760 · Ships in 10 - 15 working days

This book constitutes the refereed proceedings of the 14th China National Conference on Computational Linguistics, CCL 2015, and of the Third International Symposium on Natural Language Processing Based on Naturally Annotated Big Data, NLP-NABD 2015, held in Guangzhou, China, in November 2015. The 34 papers presented were carefully reviewed and selected from 283 submissions. The papers are organized in topical sections on lexical semantics and ontologies; semantics; sentiment analysis, opinion mining and text classification; machine translation; multilinguality in NLP; machine learning methods for NLP; knowledge graph and information extraction; discourse, coreference and pragmatics; information retrieval and question answering; social computing; and NLP applications.
