Matches in Nanopublications for { ?s ?p ?o <https://w3id.org/np/RA0sIJ5Jd9k6bKVHFBeMQ4GWCdHreIplnPjNgeTxXNQFg/assertion>. }
Showing items 1 to 21 of 21, with 100 items per page.
All 21 matching triples lie in the assertion graph of the nanopublication, so the graph column is omitted from the table below. (A sketch for retrieving these triples programmatically follows the table.)

| Subject | Predicate | Object |
| --- | --- | --- |
| CSPromKG | type | Workflow |
| KGS2S | type | Workflow |
| KGBERT | type | Workflow |
| arXiv.2402.01729 | type | Entity |
| ContextualizationDistillation | type | Workflow |
| GenKGC | type | Workflow |
| CSPromKG | label | "CSProm-KG" |
| KGS2S | label | "KG-S2S" |
| KGBERT | label | "KG-BERT" |
| ContextualizationDistillation | label | "Contextualization Distillation" |
| GenKGC | label | "GenKGC" |
| ContextualizationDistillation | comment | "Contextualization Distillation leverages LLMs to generate high-quality, context-rich descriptions for KG triplets. These LLM-generated contexts are then used in auxiliary tasks (reconstruction or contextualization) to train smaller KGC models, thereby enhancing their performance on Knowledge Graph Completion. The core idea is that LLMs augment the training data/signal for the KG completion task." |
| arXiv.2402.01729 | describes | ContextualizationDistillation |
| arXiv.2402.01729 | discusses | CSPromKG |
| arXiv.2402.01729 | discusses | KGS2S |
| arXiv.2402.01729 | discusses | KGBERT |
| arXiv.2402.01729 | discusses | GenKGC |
| ContextualizationDistillation | subject | LLMAugmentedKGCompletion |
| arXiv.2402.01729 | title | "Contextualization Distillation from Large Language Model for Knowledge Graph Completion" |
| ContextualizationDistillation | hasTopCategory | LLMAugmentedKG |
| ContextualizationDistillation | hasURL | "https://github.com/Li0406/Contextulization-Distillation" |
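Since every row above comes from a single named graph, the whole result set can be reproduced with a few lines of RDF tooling. The following is a minimal sketch, not the browser's own implementation; it assumes Python with rdflib installed, network access, and that the w3id.org resolver serves the nanopublication as TriG via content negotiation.

```python
from rdflib import Dataset, URIRef

NP_URI = "https://w3id.org/np/RA0sIJ5Jd9k6bKVHFBeMQ4GWCdHreIplnPjNgeTxXNQFg"

# Fetch the whole nanopublication. A nanopub bundles four named graphs
# (head, assertion, provenance, pubinfo); parsing as TriG keeps them apart.
ds = Dataset()
ds.parse(NP_URI, format="trig")  # assumes the server honors rdflib's Accept header

# Restrict to the assertion graph -- the same quad pattern as the query
# above: { ?s ?p ?o <.../assertion> . }
assertion = ds.graph(URIRef(NP_URI + "/assertion"))
for s, p, o in assertion:
    print(s, p, o)
```

Iterating the assertion graph should print the 21 triples tabulated above, except with full IRIs where the table shows local names (e.g. `type`, `label`, `comment`, `hasURL`).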