Matches in Nanopublications for { ?s ?p ?o <https://w3id.org/np/RA6OLbvx3KxmTsgne91YIoGYUaAQc2Y1rhoyB4HrjFyN0/assertion>. }
Showing items 1 to 23 of 23, with 100 items per page.
- arXiv.2601.00202 type Entity assertion.
- BKD type Workflow assertion.
- FitNet type Workflow assertion.
- LiEtAl2024KnowledgeDistillationForTKGRWithLLMs type Workflow assertion.
- RKD type Workflow assertion.
- TADistMult type Workflow assertion.
- TTransE type Workflow assertion.
- BKD label "Classical Knowledge Distillation (BKD)" assertion.
- FitNet label "FitNet" assertion.
- LiEtAl2024KnowledgeDistillationForTKGRWithLLMs label "Knowledge Distillation Framework for Temporal Knowledge Graph Reasoning with Large Language Models" assertion.
- RKD label "Relational Knowledge Distillation (RKD)" assertion.
- TADistMult label "Temporal Attention DistMult (TADistMult)" assertion.
- TTransE label "Temporal TransE (TTransE)" assertion.
- LiEtAl2024KnowledgeDistillationForTKGRWithLLMs comment "This method proposes a two-stage knowledge distillation framework where LLMs act as teacher models to transfer structural and temporal reasoning capabilities to lightweight student models. The goal is to improve the performance and efficiency of temporal knowledge graph reasoning (a KG completion task) by leveraging the advanced reasoning signals from LLMs." assertion.
- arXiv.2601.00202 describes LiEtAl2024KnowledgeDistillationForTKGRWithLLMs assertion.
- arXiv.2601.00202 discusses BKD assertion.
- arXiv.2601.00202 discusses FitNet assertion.
- arXiv.2601.00202 discusses RKD assertion.
- arXiv.2601.00202 discusses TADistMult assertion.
- arXiv.2601.00202 discusses TTransE assertion.
- LiEtAl2024KnowledgeDistillationForTKGRWithLLMs subject LLMAugmentedKGCompletion assertion.
- arXiv.2601.00202 title "Knowledge Distillation for Temporal Knowledge Graph Reasoning with Large Language Models" assertion.
- LiEtAl2024KnowledgeDistillationForTKGRWithLLMs hasTopCategory LLMAugmentedKG assertion.
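The quad pattern in the header above can be written as a standard SPARQL query against a quad store that exposes the nanopublication's assertion as a named graph. This is a minimal sketch, assuming such a store; the graph IRI is taken directly from the header, and the `GRAPH` form is the SPARQL 1.1 equivalent of the `?s ?p ?o <graph>` quad pattern shown there.

```sparql
# List every triple in the assertion graph of this nanopublication.
# Graph IRI copied from the listing header; store/endpoint is assumed.
SELECT ?s ?p ?o
WHERE {
  GRAPH <https://w3id.org/np/RA6OLbvx3KxmTsgne91YIoGYUaAQc2Y1rhoyB4HrjFyN0/assertion> {
    ?s ?p ?o .
  }
}
```

Against a store holding this nanopublication, the query would return the 23 triples listed above, with the entity names expanded to their full IRIs.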