CSpace

Browse/Search Results: 20 items total, showing items 1-10

Low-redundancy distillation for continual learning (Journal Article)
PATTERN RECOGNITION, 2025, Volume: 167, Pages: 12
Authors: Liu, Ruiqi; Diao, Boyu; Huang, Libo; An, Zijia; Liu, Hangda; An, Zhulin; Xu, Yongjun
Views/Downloads: 10/0  |  Submitted: 2025/12/03
Keywords: Continual learning; Lifelong learning; Catastrophic forgetting; Knowledge distillation; Experience replay
Peak-controlled logits poisoning attack in federated distillation (Journal Article)
DISCOVER COMPUTING, 2025, Volume: 28, Issue: 1, Pages: 18
Authors: Tang, Yuhan; Wu, Zhiyuan; Gao, Bo; Wen, Tian; Wang, Yuwei; Sun, Sheng
Views/Downloads: 4/0  |  Submitted: 2025/12/03
Keywords: Federated learning; Knowledge distillation; Knowledge transfer; Poisoning attack; Misleading attack
DMutDE: Dual-View Mutual Distillation Framework for Knowledge Graph Embeddings (Journal Article)
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, Pages: 14
Authors: Liu, Ruizhou; Wu, Zhe; Wu, Yiling; Cao, Zongsheng; Xu, Qianqian; Huang, Qingming
Views/Downloads: 4/0  |  Submitted: 2025/12/03
Keywords: Training; Data models; Predictive models; Cognition; Knowledge graphs; Electronic mail; Translation; Semantics; Noise; Costs; Knowledge distillation (KD); knowledge graph (KG); knowledge graph embedding (KGE)
Exploiting user comments for early detection of fake news prior to users' commenting (Journal Article)
FRONTIERS OF COMPUTER SCIENCE, 2025, Volume: 19, Issue: 10, Pages: 13
Authors: Nan, Qiong; Sheng, Qiang; Cao, Juan; Zhu, Yongchun; Wang, Danding; Yang, Guang; Li, Jintao
Views/Downloads: 24/0  |  Submitted: 2025/06/25
Keywords: fake news detection; knowledge distillation; early detection
Efficient Distillation Using Channel Pruning for Point Cloud-Based 3D Object Detection (Journal Article)
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2025, Pages: 15
Authors: Li, Fuyang; Min, Chen; Wang, Juan; Xiao, Liang; Zhao, Dawei; Nie, Yiming; Dai, Bin
Views/Downloads: 4/0  |  Submitted: 2025/12/03
Keywords: Knowledge distillation; 3D object detection; point cloud; network pruning; autonomous driving
Learning Critically: Selective Self-Distillation in Federated Learning on Non-IID Data (Journal Article)
IEEE TRANSACTIONS ON BIG DATA, 2024, Volume: 10, Issue: 6, Pages: 789-800
Authors: He, Yuting; Chen, Yiqiang; Yang, XiaoDong; Yu, Hanchao; Huang, Yi-Hua; Gu, Yang
Views/Downloads: 34/0  |  Submitted: 2024/12/06
Keywords: Data models; Training; Servers; Collaborative work; Adaptation models; Convergence; Feature extraction; Federated learning; knowledge distillation; non-identically distributed; deep learning; catastrophic forgetting
Toward Generalized Multistage Clustering: Multiview Self-Distillation (Journal Article)
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, Pages: 14
Authors: Wang, Jiatai; Xu, Zhiwei; Wang, Xin; Li, Tao
Views/Downloads: 56/0  |  Submitted: 2025/06/25
Keywords: Semantics; Feature extraction; Contrastive learning; Mutual information; Clustering methods; Representation learning; Computational modeling; Training; Knowledge engineering; Predictive models; Hierarchical contrastive learning; multistage clustering; multiview self-distillation; mutual information between views
FedCache: A Knowledge Cache-Driven Federated Learning Architecture for Personalized Edge Intelligence (Journal Article)
IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, Volume: 23, Issue: 10, Pages: 9368-9382
Authors: Wu, Zhiyuan; Sun, Sheng; Wang, Yuwei; Liu, Min; Xu, Ke; Wang, Wen; Jiang, Xuefeng; Gao, Bo; Lu, Jinda
Views/Downloads: 35/0  |  Submitted: 2024/12/06
Keywords: Computer architecture; Training; Servers; Computational modeling; Data models; Adaptation models; Performance evaluation; Distributed architecture; edge computing; personalized federated learning; knowledge distillation; communication efficiency
Open-category referring expression comprehension via multi-modal knowledge transfer (Journal Article)
NEUROCOMPUTING, 2024, Volume: 598, Pages: 10
Authors: Mi, Wenyu; Wang, Jianji; Zhuang, Fuzhen; An, Zhulin; Guo, Wei
Views/Downloads: 14/0  |  Submitted: 2025/06/25
Keywords: Referring expression comprehension; CLIP; Open-category; Knowledge distillation
PDD: Pruning Neural Networks During Knowledge Distillation (Journal Article)
COGNITIVE COMPUTATION, 2024, Pages: 11
Authors: Dan, Xi; Yang, Wenjie; Zhang, Fuyan; Zhou, Yihang; Yu, Zhuojun; Qiu, Zhen; Zhao, Boyuan; Dong, Zeyu; Huang, Libo; Yang, Chuanguang
Views/Downloads: 40/0  |  Submitted: 2024/12/06
Keywords: Knowledge distillation; Model pruning; Model compression