GLORIA

GEOMAR Library Ocean Research Information Access

  • 1
    Keywords: Computers. ; Optical data processing. ; Application software. ; Electronic books.
    Type of Medium: Online Resource
    Pages: 1 online resource (861 pages)
    Edition: 1st ed.
    ISBN: 9783030884802
    Series Statement: Lecture Notes in Computer Science Series ; v.13028
    DDC: 004
    Language: English
    Note: Intro -- Preface -- Organization -- Contents - Part I -- Contents - Part II -- Oral - Fundamentals of NLP -- Coreference Resolution: Are the Eliminated Spans Totally Worthless? -- 1 Introduction -- 2 Background -- 3 Coreference Resolution with Enhanced Mention Representation -- 3.1 Mention Detection -- 3.2 Coreference Resolving with Global Spans Perceived -- 4 Model Training -- 5 Experimentation -- 5.1 Experimental Settings -- 5.2 Experimental Results -- 5.3 Analysis on Context-Aware Word Representations -- 5.4 Case Study -- 6 Related Work -- 7 Conclusion -- References -- Chinese Macro Discourse Parsing on Dependency Graph Convolutional Network -- 1 Introduction -- 2 Related Work -- 3 Basic Model: MDParser-TS -- 4 Chinese Macro Discourse Parsing on Dependency Graph Convolutional Network -- 4.1 Internal Topic Graph Construction -- 4.2 Interactive Topic Graph Construction -- 4.3 Dependency Graph Convolutional Network -- 4.4 Classifier -- 5 Experimentation -- 5.1 Dataset and Experimental Settings -- 5.2 Baselines -- 5.3 Experimental Results -- 6 Analysis -- 6.1 Analysis on Internal Topic Graph -- 6.2 Analysis on Interactive Topic Graph -- 6.3 Experimentation on English RST-DT -- 7 Conclusion -- References -- Predicting Categorial Sememe for English-Chinese Word Pairs via Representations in Explainable Sememe Space -- 1 Introduction -- 2 Task Formalization -- 3 Methodology -- 3.1 Word Vector Space O and Sememe Space Os -- 3.2 HowNet in Sememe Space Os -- 3.3 Target Data in Sememe Space Os -- 3.4 Training and Prediction -- 4 Experiment -- 4.1 Datasets -- 4.2 Experiment Settings -- 4.3 Overall Results -- 4.4 Results on Different POS Tags -- 4.5 Results on Different Ambiguity Degrees -- 4.6 Effect of Descending Factor c -- 4.7 Effect of Training Set Ratio -- 4.8 Categorial Sememe Knowledge Base -- 5 Related Work -- 6 Conclusion and Future Work --
    References -- Multi-level Cohesion Information Modeling for Better Written and Dialogue Discourse Parsing -- 1 Introduction -- 2 Related Work -- 3 Baseline Model -- 3.1 Attention-Based EDU Encoder -- 3.2 Top-Down Baseline Model -- 3.3 Bottom-Up Baseline Model -- 3.4 Deep Sequential Baseline Model -- 4 Cohesion Modeling -- 4.1 Auto Cohesion Information Extraction -- 4.2 Graph Construction -- 4.3 Cohesion Modelling -- 4.4 Fusion Layer -- 5 Experiments -- 5.1 Datasets -- 5.2 Metric -- 5.3 Experimental Result -- 6 Conclusion -- References -- ProPC: A Dataset for In-Domain and Cross-Domain Proposition Classification Tasks -- 1 Introduction -- 2 Dataset Construction -- 2.1 Proposition Definition -- 2.2 Data Acquisition -- 2.3 Data Annotation -- 2.4 Dataset Analysis -- 3 Experiments -- 3.1 Baseline Methods -- 3.2 Experimental Setup -- 3.3 Results and Analysis -- 4 Related Work -- 5 Conclusion -- References -- CTRD: A Chinese Theme-Rheme Discourse Dataset -- 1 Introduction -- 2 Related Work -- 3 Theory Basis -- 3.1 The Theme-Rheme Theory -- 3.2 The Thematic Progression Patterns -- 4 Annotation Scheme -- 4.1 Theme-Rheme Annotation Criteria -- 4.2 Thematic Progression Annotation Criteria -- 5 Statistics -- 6 Experiments and Analysis -- 6.1 Theme-Rheme Automatic Recognition -- 6.2 Function Types Automatic Recognition -- 7 Conclusion -- References -- Machine Translation and Multilinguality -- Learning to Select Relevant Knowledge for Neural Machine Translation -- 1 Introduction -- 2 Our Approach -- 2.1 Problem Definition -- 2.2 Retrieval Stage -- 2.3 Machine Translation via Selective Context -- 2.4 Multi-task Learning Framework -- 3 Evaluation and Datasets -- 3.1 Evaluation -- 3.2 Datasets -- 3.3 Training Details -- 3.4 Baselines -- 3.5 Results -- 4 Analysis -- 5 Related Work -- 6 Conclusion -- References --
    Contrastive Learning for Machine Translation Quality Estimation -- 1 Introduction -- 2 Related Work -- 2.1 Machine Translation Quality Estimation -- 2.2 Contrastive Learning -- 3 Our Method -- 3.1 Denoising Reconstructed Samples -- 3.2 Contrastive Training -- 4 Experiments -- 4.1 Setup -- 4.2 Results and Analysis -- 4.3 Different Methods to Create Negative Samples -- 4.4 Compare with Metric-Based Method -- 5 Conclusion -- References -- Sentence-State LSTMs For Sequence-to-Sequence Learning -- 1 Introduction -- 2 Approach -- 2.1 Sentence-State LSTM Encoder -- 2.2 Comparison with RNNs, CNNs and Transformer -- 2.3 LSTM Decoder -- 2.4 Training -- 3 Experiments -- 3.1 Main Results -- 4 Analysis -- 4.1 Ablation Study -- 4.2 Effect of Recurrent Steps -- 5 Related Work -- 5.1 Seq2seq Modeling -- 5.2 Efficient Sequence Encoding -- 6 Conclusion -- References -- Guwen-UNILM: Machine Translation Between Ancient and Modern Chinese Based on Pre-Trained Models -- 1 Introduction -- 2 Related Work -- 3 The Guwen-UNILM Framework -- 3.1 Pre-training Step -- 3.2 Fine-Tuning Step -- 4 Experiment -- 4.1 Datasets -- 4.2 Experimental Setup -- 4.3 Comparative Models -- 4.4 Evaluation Metrics -- 4.5 Results and Discussion -- 5 Conclusion -- References -- Adaptive Transformer for Multilingual Neural Machine Translation -- 1 Introduction -- 2 Related Work -- 3 Background -- 4 Proposed Method -- 4.1 Adaptive Transformer -- 4.2 Adaptive Attention Layer -- 4.3 Adaptive Feed-Forward Layer -- 5 Experiments -- 5.1 Dataset -- 5.2 Model Configurations -- 5.3 Main Results -- 5.4 Ablation Study -- 5.5 Analysis on Shared Rate -- 5.6 Analysis on Low-Resource Language -- 6 Conclusion and Future Work -- References -- Improving Non-autoregressive Machine Translation with Soft-Masking -- 1 Introduction -- 2 Background -- 2.1 Autoregressive Machine Translation --
    2.2 Non-autoregressive Machine Translation -- 3 Method -- 3.1 Encoder -- 3.2 Decoder -- 3.3 Discriminator -- 3.4 Glancing Training -- 4 Experiments -- 4.1 Experiment Settings -- 4.2 Main Results -- 4.3 Decoding Speed -- 5 More Analysis -- 6 Related Works -- 7 Conclusion -- References -- Machine Learning for NLP -- AutoNLU: Architecture Search for Sentence and Cross-sentence Attention Modeling with Re-designed Search Space -- 1 Introduction -- 2 Search Space Design -- 2.1 Meta-architectures -- 2.2 Encoder Operations -- 2.3 Aggregator Search Space -- 2.4 Design Choices -- 3 Architecture Search -- 3.1 Search Algorithm -- 3.2 Child Model Training -- 3.3 Improving Weight Sharing -- 3.4 Search Warm-Up -- 4 Experiments and Discussion -- 4.1 Datasets -- 4.2 Architecture Search Protocols -- 4.3 Results -- 4.4 Ablation on Our Strategies -- 4.5 Ablation on Our Search Space -- 5 Conclusion and Future Work -- References -- AutoTrans: Automating Transformer Design via Reinforced Architecture Search -- 1 Introduction -- 2 Related Work -- 3 Search Space Design -- 4 Architecture Search -- 4.1 Search Algorithm -- 4.2 Deriving Architectures -- 4.3 Cross-operation Parameter Sharing -- 4.4 Cross-layer Parameter Sharing -- 5 Experiments and Results -- 5.1 Datasets -- 5.2 Architecture Search Protocols -- 5.3 Main Results -- 5.4 Effects of Proportions of Training Data -- 5.5 Effects of Different Learning Rates on the Learned Architecture -- 5.6 Effects of Learning Rate on Search -- 6 Conclusions and Discussions -- References -- A Word-Level Method for Generating Adversarial Examples Using Whole-Sentence Information -- 1 Introduction -- 2 Related Work -- 3 Methodology -- 3.1 Selecting Candidate Substitutes -- 3.2 Searching for Adversarial Examples -- 4 Experiments -- 4.1 Setup -- 4.2 Results -- 5 Analysis and Discussions -- 5.1 Ablation Analyses -- 5.2 Effect of Beam Size --
    5.3 Adversarial Training -- 6 Conclusion -- References -- RAST: A Reward Augmented Model for Fine-Grained Sentiment Transfer -- 1 Introduction -- 2 Methodology -- 2.1 Overview -- 2.2 Encoder-Decoder Based Sentiment Transfer Model -- 2.3 Comparative Discriminator -- 2.4 Reward Augmented Training of Sentiment Transfer Model -- 3 Experiments -- 3.1 Experiment Settings -- 3.2 Evaluation Metrics -- 3.3 Results and Analysis -- 3.4 Ablation Study -- 3.5 Case Study -- 4 Related Work -- 5 Conclusion -- References -- Pre-trained Language Models for Tagalog with Multi-source Data -- 1 Introduction -- 2 Related Previous Research -- 2.1 Natural Language Processing for Tagalog -- 2.2 Pre-trained Language Model for Tagalog -- 3 Model -- 3.1 BERT -- 3.2 RoBERTa -- 3.3 ELECTRA -- 4 Pre-training Corpus -- 4.1 Oscar -- 4.2 Wiki -- 4.3 News -- 5 Experiment -- 5.1 Downstream Tasks -- 5.2 Pre-training -- 5.3 Fine-Tuning -- 5.4 Experiment Results and Analysis -- 6 Conclusion -- References -- Accelerating Pretrained Language Model Inference Using Weighted Ensemble Self-distillation -- 1 Introduction -- 2 Weighted Ensemble Self-distillation -- 2.1 Early Exiting -- 2.2 Weighted Ensemble Self-distillation -- 2.3 Adaptive Inference -- 3 Experiments -- 3.1 Datasets and Evaluation Metrics -- 3.2 Baselines -- 3.3 Implementation Details -- 3.4 Comparative Results -- 3.5 Ablation Experiments -- 3.6 The Effect of Weighted Ensemble Self-distillation -- 4 Conclusions -- References -- Information Extraction and Knowledge Graph -- Employing Sentence Compression to Improve Event Coreference Resolution -- 1 Introduction -- 2 Related Work -- 3 Event Coreference Resolution on Sentence Compression -- 3.1 Event Extraction -- 3.2 Event Sentence Compression -- 3.3 Event Coreference Resolution -- 4 Experimentation -- 4.1 Experimental Settings -- 4.2 Results on Event Extraction -- 4.3 Results on Event Coreference Resolution.
  • 2
    Keywords: Natural language processing (Computer science)-Congresses. ; Electronic books.
    Type of Medium: Online Resource
    Pages: 1 online resource (647 pages)
    Edition: 1st ed.
    ISBN: 9783030884833
    Series Statement: Lecture Notes in Computer Science Series ; v.13029
    DDC: 006.35
    Language: English
    Note: Intro -- Preface -- Organization -- Contents - Part II -- Contents - Part I -- Posters - Fundamentals of NLP -- Syntax and Coherence - The Effect on Automatic Argument Quality Assessment -- 1 Introduction -- 2 Related Work -- 2.1 Theory Studies -- 2.2 Empirical Methods -- 3 Methodology -- 3.1 Input -- 3.2 Syntax Encoder -- 3.3 Coherence Encoder -- 3.4 Classification -- 4 Experiments -- 4.1 Dataset -- 4.2 Settings -- 4.3 Baselines -- 4.4 Results and Discussions -- 5 Case Study -- 5.1 Syntax Encoder -- 5.2 Coherence Encoder -- 6 Conclusions and Future Work -- References -- ExperienceGen 1.0: A Text Generation Challenge Which Requires Deduction and Induction Ability -- 1 Introduction -- 2 Related Work -- 3 Task Formulation -- 4 Dataset Construction -- 4.1 Causal Sentences and Syllogism -- 4.2 Get Candidate Sentences -- 4.3 Syllogism Reconstruction -- 4.4 Commonsense Knowledge Extraction -- 4.5 Quality Inspection of Dataset -- 4.6 Dataset Statistics -- 5 Experiments and Analysis -- 6 Result -- 6.1 Quantitative Analysis -- 6.2 Qualitative Analysis -- 7 Conclusion -- References -- Machine Translation and Multilinguality -- SynXLM-R: Syntax-Enhanced XLM-R in Translation Quality Estimation -- 1 Introduction -- 2 Related Works -- 3 Methodology -- 3.1 XLM-R -- 3.2 Syntax-Aware Extractor -- 4 Experiments -- 4.1 Data Preparation -- 4.2 Baselines -- 4.3 Training Details -- 4.4 Correlation with DA Scores -- 5 Discussion -- 5.1 Effect of Different Parsers -- 5.2 Attention Heads in GAT -- 5.3 Limitations and Suggestions -- 6 Conclusion -- References -- Machine Learning for NLP -- Memetic Federated Learning for Biomedical Natural Language Processing -- 1 Introduction -- 2 Related Work -- 3 Mem-Fed Framework -- 3.1 Overview -- 3.2 Local Training -- 3.3 Local Searching -- 3.4 Model Aggregation -- 4 Experiments -- 4.1 Experimental Setup --
    4.2 Quantitative Comparison -- 4.3 Group Number in Mem-Fed -- 4.4 Ablation Study on Memetic Aggregation -- 4.5 Local Searching Strategies of Mem-Fed -- 5 Conclusion -- References -- Information Extraction and Knowledge Graph -- Event Argument Extraction via a Distance-Sensitive Graph Convolutional Network -- 1 Introduction -- 2 The DSGCN Model -- 2.1 Word Encoding -- 2.2 Distance-Sensitive Graph Convolutional Network -- 2.3 Task-Specific Pooling -- 2.4 Argument Classification -- 3 Experiments -- 4 Conclusion and Future Work -- References -- Exploit Vague Relation: An Augmented Temporal Relation Corpus and Evaluation -- 1 Introduction -- 2 Related Work -- 3 Data Annotation -- 3.1 Data -- 3.2 Annotation Process -- 3.3 Data Statistics -- 3.4 Annotation Quality -- 4 Temporal Relation Classification and Corpus Usage Strategy -- 4.1 Temporal Relation Classification Model -- 4.2 Corpus Usage Strategy -- 5 Experimentation -- 5.1 Experimental Settings -- 5.2 Experimental Results -- 5.3 Generalization -- 5.4 Error Analysis -- 5.5 Downsampling Training Set -- 6 Conclusion -- References -- Searching Effective Transformer for Seq2Seq Keyphrase Generation -- 1 Introduction -- 2 Methodology -- 2.1 Reduce Attention to Uninformative Content -- 2.2 Relative Multi-head Attention -- 3 Experiment Settings -- 3.1 Notations and Problem Definition -- 3.2 Datasets -- 3.3 Evaluation Metrics -- 3.4 Implementation Details -- 4 Results and Discussions -- 4.1 Applying Transformer to Keyphrase Generation -- 4.2 Tuning Transformer Model -- 4.3 Adapting Transformer to Keyphrase Generation -- 4.4 Observations and Findings -- 5 Related Work -- 6 Conclusion -- References -- Prerequisite Learning with Pre-trained Language and Graph Embedding Models -- 1 Introduction -- 2 Related Work -- 3 Proposed Approach -- 3.1 Text-Based Module -- 3.2 Graph-Based Module --
    3.3 Joint Learning of Two Modules -- 4 Experiments -- 4.1 Datasets -- 4.2 Evaluation Settings -- 4.3 Results -- 5 Conclusion -- References -- Summarization and Generation -- Variational Autoencoder with Interactive Attention for Affective Text Generation -- 1 Introduction -- 2 Variational Autoencoder with Interactive Attention -- 2.1 Encoder -- 2.2 Variational Attention -- 2.3 Decoder -- 3 Experimental Results -- 3.1 Dataset -- 3.2 Evaluation Metrics -- 3.3 Implementation Details -- 3.4 Comparative Results -- 3.5 Ablation Experiment -- 3.6 Case Study -- 4 Conclusions -- References -- CUSTOM: Aspect-Oriented Product Summarization for E-Commerce -- 1 Introduction -- 2 Methodology -- 2.1 CUSTOM: Aspect-Oriented Product Summarization for E-Commerce -- 2.2 SMARTPHONE and COMPUTER -- 2.3 EXT: Extraction-Enhanced Generation Framework -- 3 Experiment -- 3.1 Comparison Methods -- 3.2 Implementation Details -- 3.3 Diversity Evaluation for CUSTOM -- 3.4 Quality Evaluation for EXT -- 3.5 Human Evaluation -- 3.6 Extractor Analysis -- 3.7 Case Study -- 4 Related Work -- 4.1 Product Summarization -- 4.2 Conditional Text Generation -- 5 Conclusion -- References -- Question Answering -- FABERT: A Feature Aggregation BERT-Based Model for Document Reranking -- 1 Introduction -- 2 Related Work -- 3 Model -- 3.1 Problem Definition -- 3.2 QA Pairs Encoder -- 3.3 Feature Aggregation -- 4 Experiment and Results -- 4.1 Dataset and Baselines -- 4.2 Evaluation Metrics -- 4.3 Setting -- 4.4 Results -- 5 Conclusion -- References -- Generating Relevant, Correct and Fluent Answers in Natural Answer Generation -- 1 Introduction -- 2 Related Words -- 2.1 Natural Answer Generation -- 2.2 Text Editing -- 3 Splitting Answering Process into Template Generation and Span Extraction -- 3.1 Span Extraction -- 3.2 Template Generation with Editing -- 3.3 Filling the Template -- 3.4 Training --
    4 Selecting in Candidate Spans -- 4.1 Using Statistic in Training Data -- 4.2 Using Masked Language Model -- 4.3 Final Score -- 5 Experiments -- 5.1 Dataset and Settings -- 5.2 Results -- 5.3 Ablations -- 5.4 Discussion About Candidate Span Selection -- 5.5 Case Study -- 5.6 Advantages Compared to Extracting and Generative Models -- 6 Conclusion -- References -- GeoCQA: A Large-Scale Geography-Domain Chinese Question Answering Dataset from Examination -- 1 Introduction -- 2 Related Work -- 2.1 Machine Reading Comprehension -- 2.2 Open-Domain Question Answering -- 2.3 Comparison with Other Datasets -- 3 Dataset Collection and Analysis -- 3.1 Dataset Collection -- 3.2 Reasoning Types -- 4 Experiments -- 4.1 Rule-Based Method -- 4.2 Neural Models -- 4.3 Experiment Setting -- 4.4 Baseline Results -- 4.5 Error Analysis -- 5 Conclusion -- References -- Dialogue Systems -- Generating Informative Dialogue Responses with Keywords-Guided Networks -- 1 Introduction -- 2 Related Work -- 3 Keywords-Guided Sequence-to-Sequence Model -- 3.1 Context Encoder and Response Decoder -- 3.2 Keywords Decoder and Keywords Encoder -- 3.3 The Cosine Annealing Mechanism -- 3.4 Keywords Acquisition -- 4 Experiments -- 4.1 Experiments Setting -- 4.2 Datasets -- 4.3 Automatic Evaluation -- 4.4 Human Evaluation -- 4.5 The Keywords Ratio -- 5 Conclusion -- A Appendix -- A.1 The Cosine Annealing Mechanism -- A.2 Case Study -- References -- Zero-Shot Deployment for Cross-Lingual Dialogue System -- 1 Introduction -- 2 Problem Definition and Background -- 3 Approach -- 3.1 Pseudo Data Construction -- 3.2 Noise Injection Method -- 3.3 Multi-task Training and Adaptation -- 4 Experiments -- 4.1 Experimental Settings -- 4.2 Experimental Results and Analysis -- 5 Related Work -- 6 Conclusion -- References --
    MultiWOZ 2.3: A Multi-domain Task-Oriented Dialogue Dataset Enhanced with Annotation Corrections and Co-Reference Annotation -- 1 Introduction -- 2 Annotation Corrections -- 2.1 Dialogue Act Corrections -- 2.2 Dialogue State Corrections -- 3 Enhance Dataset with Co-Referencing -- 3.1 Annotation for Co-reference in Dialogue -- 3.2 Annotation for Co-reference in User Goal -- 4 Benchmarks and Experimental Results -- 4.1 Dialogue Actions with Natural Language Understanding Benchmarks -- 4.2 Dialogue State Tracking Benchmarks -- 4.3 Experimental Analysis -- 5 Discussion -- 6 Conclusion -- References -- EmoDialoGPT: Enhancing DialoGPT with Emotion -- 1 Introduction -- 2 Related Work -- 3 Methodology -- 3.1 Model Architecture -- 3.2 Input Representation -- 3.3 Emotion Injection -- 3.4 Optimization -- 4 Dataset -- 4.1 Emotion Classifier -- 4.2 Dialogue Dataset with Emotion Labels -- 5 Experiments -- 5.1 Experimental Settings -- 5.2 Baselines -- 5.3 Automatic Evaluation of Emotion Expression -- 5.4 Automatic Evaluation of Response Quality -- 5.5 Human Evaluation -- 5.6 Case Study -- 6 Conclusion and Future Work -- References -- Social Media and Sentiment Analysis -- BERT-Based Meta-Learning Approach with Looking Back for Sentiment Analysis of Literary Book Reviews -- 1 Introduction -- 2 Related Work -- 3 Method -- 3.1 BERT-Based Meta-Learning -- 3.2 Meta-Learning with Looking Back -- 4 Experiments -- 4.1 Dataset -- 4.2 Settings -- 4.3 Result -- 5 Conclusion -- References -- ISWR: An Implicit Sentiment Words Recognition Model Based on Sentiment Propagation -- 1 Introduction -- 2 Related Work -- 2.1 Implicit Sentiment Analysis -- 2.2 Sentiment Propagation -- 3 Implicit Sentiment Words Recognition Based on Sentiment Propagation -- 3.1 Construct Words Graph -- 3.2 Sentiment Propagation in Word Graph -- 4 Experiment and Analysis -- 4.1 Datasets and Evaluation Index.
  • 3
    Online Resource
    Singapore : Springer Singapore | Singapore : Imprint: Springer
    Keywords: Sustainable development. ; Analytical chemistry. ; Agriculture. ; Water. ; Environmental monitoring. ; Sustainability. ; Hydrology. ; Soil science. ; Great Plain ; Agricultural landscape ; Irrigation farming ; Groundwater balance ; Groundwater reserve ; Intensive agriculture ; Northern China ; Lowland ; Irrigation ; Groundwater extraction ; Groundwater drawdown ; Groundwater balance ; Hebei ; Henan ; Tientsin Province ; Shandong ; Peking Region ; Irrigation planning ; Agricultural land ; Water scarcity
    Description / Table of Contents: Chapter 1 Introduction -- Chapter 2 Policy options of over-pumping control in the NCP -- Chapter 3 Cropping choices and farmers’ options -- Chapter 4 Decision support for local water authorities in Guantao County -- Chapter 5 Way forward.
    Type of Medium: Online Resource
    Pages: 1 online resource (XVIII, 157 p., 98 illus., 89 illus. in color)
    Edition: 1st ed. 2022.
    ISBN: 9789811658433
    Series Statement: Springer Water
    Language: English
    Note: Open Access
  • 4
    Online Resource
    Singapore : Springer Singapore Pte. Limited
    Keywords: Electronic books
    Description / Table of Contents: Intro -- Foreword -- Acknowledgements -- Contents -- Abbreviations and Units -- Abbreviations -- Units -- 1 Introduction -- 1.1 Groundwater Over-Pumping and Consequences -- 1.2 What Does Sustainable Groundwater Use Mean? -- 1.3 Role of Irrigation in Over-Pumping in NCP -- 1.4 Requirements for Sustainability in NCP and Guantao as an Example -- References -- 2 Policy Options of Over-Pumping Control in the NCP -- 2.1 China's Groundwater Policies in Recent years -- 2.1.1 Permit Policy for Well Drilling -- 2.1.2 Well-Spacing Policy -- 2.1.3 Quota Management -- 2.1.4 Water Resources Fee and Tax -- 2.1.5 Irrigation Water Price Policy -- 2.1.6 Water Rights System and Water Markets -- 2.1.7 National Policy Focus: NCP's Groundwater Over-Pumping -- 2.2 Groundwater Over-Pumping Control Measures in Hebei Province -- 2.2.1 Seasonal Land Fallowing -- 2.2.2 Substitution of Non-food Crops for Grain Crops -- 2.2.3 Replacing Groundwater by Surface Water -- 2.2.4 Buy-Back of Water Rights -- 2.2.5 "Increase Price and Provide Subsidy" -- 2.2.6 Tiered Scheme of Water Fees -- 2.2.7 Import of Surface Water Versus Water Saving and Change of Cropping Structure -- 2.3 Governance Structure in the Water Sector -- 2.3.1 Governmental Stakeholders in Water Sector -- 2.3.2 Stakeholders in the Electricity Sector Related to Irrigation -- 2.3.3 Stakeholders in the Water Sector of Guantao County -- References -- 3 Cropping Choices and Farmers' Options -- 3.1 Options of Optimizing Crop Structure in Hebei-Beijing-Tianjin Region -- 3.1.1 Introduction -- 3.1.2 Optimization Scenarios -- 3.1.3 Scenario Analysis of Planting Structure Optimization -- 3.1.4 Conclusion and Discussion -- 3.2 Farmers' Feedback in a Household Survey on Seasonal Land Fallowing -- 3.2.1 Effects of Seasonal Land Fallowing -- 3.2.2 Challenges of Implementing SLFP.
    Type of Medium: Online Resource
    Pages: 1 online resource (171 pages)
    ISBN: 9789811658433
    Series Statement: Springer Water Ser.
    Language: English
    Note: Description based on publisher supplied metadata and other sources
  • 5
    Online Resource
    Singapore : Springer Nature Singapore | Singapore : Imprint: Springer
    Keywords: Geotechnical engineering. ; Geophysics. ; Cogeneration of electric power and heat. ; Fossil fuels. ; Fluid mechanics. ; Industrial engineering. ; Production engineering. ; Energy policy. ; Energy and state. ; Central China ; Sichuan ; Natural gas ; Carbonate rock ; Reservoir rock ; Petrophysics ; Basin ; Dolomite ; Dolomitic limestone ; Three-dimensional model ; Petrophysics ; Petrology ; Petrophysics ; Natural gas deposit ; Reservoir rock ; Drill core analysis ; Lithology ; Petrology ; Sedimentary basin ; Stromatolite ; Deep structure ; Clastic rock ; Permeability ; Porosity
    Description / Table of Contents: Chapter 1: Reservoir Characteristics, Storage and Percolation Capacities of Ultradeep Carbonate Gas Reservoirs -- Chapter 2: Percolation Mechanism of Ultradeep Carbonate Gas Reservoir -- Chapter 3: Stress Sensitivity Characteristics of Ultradeep Carbonate Gas Reservoir -- Chapter 4: Gas Production Characteristics of Ultradeep Carbonate Gas Reservoir.
    Type of Medium: Online Resource
    Pages: 1 online resource (XV, 325 p., 194 illus., 191 illus. in color)
    Edition: 1st ed. 2023.
    ISBN: 9789811997082
    Language: English
  • 6
    Source: ACS Legacy Archives
    Topics: Chemistry and Pharmacology, Physics
    Type of Medium: Electronic Resource
  • 7
    Electronic Resource
    [S.l.] : American Institute of Physics (AIP)
    Journal of Applied Physics 78 (1995), pp. 6193-6196
    ISSN: 1089-7550
    Source: AIP Digital Archive
    Topics: Physics
    Notes: We have fabricated light-emitting nanocrystallites embedded in an a-Si:H matrix using a conventional plasma-enhanced chemical-vapor-deposition system. The photoluminescence properties were found to be directly related to the deposition parameters, and a quantum size effect model is proposed to explain the photoluminescence. Two structural prerequisites are proposed for films of this kind to exhibit effective light emission: one is an upper limit on the mean crystallite size of about 3.4 nm; the other is an upper limit on the crystallinity of about 30%. © 1995 American Institute of Physics.
    Type of Medium: Electronic Resource
  • 8
    ISSN: 1520-4995
    Source: ACS Legacy Archives
    Topics: Biology, Chemistry and Pharmacology
    Type of Medium: Electronic Resource
  • 9
    ISSN: 1520-4995
    Source: ACS Legacy Archives
    Topics: Biology, Chemistry and Pharmacology
    Type of Medium: Electronic Resource
  • 10
    Electronic Resource
    Woodbury, NY : American Institute of Physics (AIP)
    Applied Physics Letters 69 (1996), pp. 596-598
    ISSN: 1077-3118
    Source: AIP Digital Archive
    Topics: Physics
    Notes: We have observed visible electroluminescence (EL) from silicon nanocrystallites embedded in a-Si:H films prepared in a plasma-enhanced chemical vapor deposition system. The EL spectra span the range of 500–850 nm, with two peaks located at about 630–680 and 730 nm, respectively. We found that the intensity of the EL peaks is closely related to the conductivity of the deposited films. The carrier conduction path is discussed in terms of the material's structural characteristics, and a tentative explanation of the light emission mechanism is proposed. © 1996 American Institute of Physics.
    Type of Medium: Electronic Resource