Hao Zhang
ORCID: 0000-0003-0877-2681
Affiliations:
- University of Science and Technology of China, State Key Laboratory of Cognitive Intelligence, Hefei, China
According to our database, Hao Zhang authored at least 14 papers between 2023 and 2025.
Bibliography
2025
Cross-Domain Pre-training with Language Models for Transferable Time Series Representations.
Proceedings of the Eighteenth ACM International Conference on Web Search and Data Mining, 2025
2024
Molar: Multimodal LLMs with Collaborative Filtering Alignment for Enhanced Sequential Recommendation.
CoRR, 2024
Pre-trained Language Model and Knowledge Distillation for Lightweight Sequential Recommendation.
CoRR, 2024
Learning Transferable Time Series Classifier with Cross-Domain Pre-training from Language Model.
CoRR, 2024
Towards Personalized Evaluation of Large Language Models with An Anonymous Crowd-Sourcing Platform.
Companion Proceedings of the ACM on Web Conference 2024, 2024
Optimizing Code Retrieval: High-Quality and Scalable Dataset Annotation through Large Language Models.
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024
Proceedings of Database Systems for Advanced Applications, 2024
Proceedings of Database Systems for Advanced Applications, 2024
Reformulating Sequential Recommendation: Learning Dynamic User Interest with Content-enriched Language Modeling.
Proceedings of Database Systems for Advanced Applications, 2024
Empowering Sequential Recommendation from Collaborative Signals and Semantic Relatedness.
Proceedings of Database Systems for Advanced Applications, 2024
Learning the Dynamics in Sequential Recommendation by Exploiting Real-time Information.
Proceedings of the 33rd ACM International Conference on Information and Knowledge Management, 2024
2023
CoRR, 2023
CoRR, 2023
TimeMAE: Self-Supervised Representations of Time Series with Decoupled Masked Autoencoders.
CoRR, 2023