Matches in Nanopublications for { <https://neverblink.eu/ontologies/llm-kg/methods#ContextualizationDistillation> ?p ?o ?g. }
Showing items 1 to 6 of 6, with 100 items per page.
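This match list can be reproduced programmatically. Below is a minimal sketch in Python, assuming the SPARQLWrapper package and interpreting `?g` as the containing graph; the endpoint URL is a placeholder assumption, not taken from this page:

```python
# Minimal sketch: run the quad pattern above against a nanopub SPARQL endpoint.
# ENDPOINT is a placeholder assumption; substitute a real endpoint URL.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://example.org/sparql"  # placeholder, not from this page

QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g {
    <https://neverblink.eu/ontologies/llm-kg/methods#ContextualizationDistillation> ?p ?o .
  }
}
LIMIT 100
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

# Print each matched predicate/object pair with its containing graph.
for row in results["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```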
- ContextualizationDistillation type Workflow assertion.
- ContextualizationDistillation label "Contextualization Distillation" assertion.
- ContextualizationDistillation comment "Contextualization Distillation leverages LLMs to generate high-quality, context-rich descriptions for KG triplets. These LLM-generated contexts are then used in auxiliary tasks (reconstruction or contextualization) to train smaller KGC models, thereby enhancing their knowledge graph completion performance. The core idea is that LLMs augment the training data and signal for the KG completion task." assertion. (A toy sketch of this two-stage pipeline follows the list below.)
- ContextualizationDistillation subject LLMAugmentedKGCompletion assertion.
- ContextualizationDistillation hasTopCategory LLMAugmentedKG assertion.
- ContextualizationDistillation hasURL "https://github.com/Li0406/Contextulization-Distillation" assertion.
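Read back as RDF, the six assertions above describe a single resource. The following sketch rebuilds that assertion graph with rdflib, assuming the usual expansions for the abbreviated predicates (`type` → rdf:type, `label` → rdfs:label, `comment` → rdfs:comment) and assuming that `subject`, `hasTopCategory`, `hasURL`, and the `Workflow`, `LLMAugmentedKGCompletion`, and `LLMAugmentedKG` terms live in an llm-kg namespace; the page abbreviates all of these, so the full IRIs are assumptions:

```python
# Sketch of the assertion graph implied by the six matches above.
# Namespace placements for the abbreviated terms are assumptions; the
# results page does not show full predicate IRIs.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

METHODS = Namespace("https://neverblink.eu/ontologies/llm-kg/methods#")
LLMKG = Namespace("https://neverblink.eu/ontologies/llm-kg#")  # assumed

g = Graph()
cd = METHODS.ContextualizationDistillation

g.add((cd, RDF.type, LLMKG.Workflow))  # Workflow namespace assumed
g.add((cd, RDFS.label, Literal("Contextualization Distillation")))
g.add((cd, RDFS.comment, Literal(
    "Contextualization Distillation leverages LLMs to generate "
    "high-quality, context-rich descriptions for KG triplets...")))
    # abridged here; full text as in the match above
g.add((cd, LLMKG.subject, METHODS.LLMAugmentedKGCompletion))  # could also be dcterms:subject
g.add((cd, LLMKG.hasTopCategory, LLMKG.LLMAugmentedKG))
g.add((cd, LLMKG.hasURL,
       Literal("https://github.com/Li0406/Contextulization-Distillation")))

print(g.serialize(format="turtle"))
```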
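The comment assertion outlines a two-stage pipeline: (1) an LLM writes a context-rich description for each (head, relation, tail) triplet, and (2) a smaller KGC model is trained with an auxiliary objective over those descriptions. The toy sketch below illustrates that shape only; `call_llm`, the prompt wording, and the mixing weight `alpha` are illustrative placeholders, not taken from the paper or the linked repository:

```python
# Toy sketch of the contextualization-distillation idea (illustrative only).
from dataclasses import dataclass

@dataclass
class Triplet:
    head: str
    relation: str
    tail: str

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; replace with a real text-generation API."""
    raise NotImplementedError

def generate_context(t: Triplet) -> str:
    # Stage 1: distill a context-rich description for the triplet.
    prompt = (f"Write a short, factual paragraph describing the relation "
              f"'{t.relation}' between '{t.head}' and '{t.tail}'.")
    return call_llm(prompt)

def training_loss(kgc_loss: float, aux_loss: float, alpha: float = 0.5) -> float:
    # Stage 2: the smaller KGC model keeps its usual completion objective
    # and adds an auxiliary loss (e.g. reconstructing the generated
    # context); alpha is an illustrative mixing weight.
    return kgc_loss + alpha * aux_loss
```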