Matches in Nanopublications for { <https://neverblink.eu/ontologies/llm-kg/methods#LiEtAl2024KnowledgeDistillationForTKGRWithLLMs> ?p ?o ?g. }
Showing items 1 to 5 of 5 (100 items per page).
- LiEtAl2024KnowledgeDistillationForTKGRWithLLMs type Workflow assertion.
- LiEtAl2024KnowledgeDistillationForTKGRWithLLMs label "Knowledge Distillation Framework for Temporal Knowledge Graph Reasoning with Large Language Models" assertion.
- LiEtAl2024KnowledgeDistillationForTKGRWithLLMs comment "This method proposes a two-stage knowledge distillation framework where LLMs act as teacher models to transfer structural and temporal reasoning capabilities to lightweight student models. The goal is to improve the performance and efficiency of temporal knowledge graph reasoning (a KG completion task) by leveraging the advanced reasoning signals from LLMs." assertion.
- LiEtAl2024KnowledgeDistillationForTKGRWithLLMs subject LLMAugmentedKGCompletion assertion.
- LiEtAl2024KnowledgeDistillationForTKGRWithLLMs hasTopCategory LLMAugmentedKG assertion.
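The listing above is the result of matching the quad pattern `{ <…#LiEtAl2024KnowledgeDistillationForTKGRWithLLMs> ?p ?o ?g. }`: the subject is fixed, and each hit binds a predicate, an object, and the graph (here always the assertion graph). A minimal pure-Python sketch of this pattern match — the quads are transcribed from the listing, with the long comment abridged and predicate names kept in the page's abbreviated form rather than expanded to full IRIs, which are not shown:

```python
# The five matched statements as (subject, predicate, object, graph) quads,
# transcribed from the search-result listing; the rdfs:comment object is
# abridged and predicate IRIs are abbreviated as on the page.
M = ("https://neverblink.eu/ontologies/llm-kg/methods#"
     "LiEtAl2024KnowledgeDistillationForTKGRWithLLMs")
quads = [
    (M, "type", "Workflow", "assertion"),
    (M, "label", "Knowledge Distillation Framework for Temporal "
        "Knowledge Graph Reasoning with Large Language Models", "assertion"),
    (M, "comment", "This method proposes a two-stage knowledge distillation "
        "framework ...", "assertion"),  # full text abridged
    (M, "subject", "LLMAugmentedKGCompletion", "assertion"),
    (M, "hasTopCategory", "LLMAugmentedKG", "assertion"),
]

def match(subject):
    """All (?p, ?o, ?g) bindings for the pattern { <subject> ?p ?o ?g. }."""
    return [(p, o, g) for s, p, o, g in quads if s == subject]

print(len(match(M)))
```

Running this prints `5`, the same result count the page reports; in a real nanopublication store the pattern would be evaluated as a SPARQL query over named graphs rather than a list comprehension.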