Xuelei Li

ORCID: 0000-0002-7935-6290

According to our database, Xuelei Li authored at least 28 papers between 2012 and 2024.

Bibliography

2024
First Place Solution to the ECCV 2024 ROAD++ Challenge @ ROAD++ Atomic Activity Recognition 2024.
CoRR, 2024

An In-Depth Assessment of Sequence Clustering Software in Bioinformatics.
Proceedings of the Bioinformatics Research and Applications - 20th International Symposium, 2024

2023
Strong generalization in quantum neural networks.
Quantum Inf. Process., December 2023

SWsnn: A Novel Simulator for Spiking Neural Networks.
J. Comput. Biol., September 2023

PCPI: Prediction of circRNA and Protein Interaction Using Machine Learning Method.
Proceedings of the Bioinformatics Research and Applications - 19th International Symposium, 2023

Outsourcing the Computation of Plaintext Encryption for Homomorphic Encryption.
Proceedings of the 8th International Conference on Computer and Communication Systems, 2023

2022
Born Scattering Integral, Scattering Radiation Pattern, and Generalized Radon Transform Inversion in Acoustic Tilted Transversely Isotropic Media.
IEEE Trans. Geosci. Remote. Sens., 2022

Compact Multiple Attribute-Based Signatures With Key Aggregation and Its Application.
IEEE Syst. J., 2022

An Efficient Certificateless Ring Signcryption Scheme With Conditional Privacy-Preserving in VANETs.
J. Syst. Archit., 2022

nGIA: A novel Greedy Incremental Alignment based algorithm for gene sequence clustering.
Future Gener. Comput. Syst., 2022

Online Decentralized Task Allocation Optimization for Edge Collaborative Networks.
Proceedings of the IEEE Symposium on Computers and Communications, 2022

Simulating Spiking Neural Networks Based on SW26010pro.
Proceedings of the Bioinformatics Research and Applications - 18th International Symposium, 2022

Privacy Preserving Electronic Scoring Scheme based on CKKS.
Proceedings of the 22nd IEEE International Conference on Communication Technology, 2022

2021
Structural Characteristics of Moho Surface Based on Time Series Function of Natural Earthquakes.
Remote. Sens., 2021

An improved model training method for residual convolutional neural networks in deep learning.
Multim. Tools Appl., 2021

An Efficient Greedy Incremental Sequence Clustering Algorithm.
Proceedings of the Bioinformatics Research and Applications - 17th International Symposium, 2021

2019
Characteristics of a Magnetic Field Sensor with a Concentrating-Conducting Magnetic Flux Structure.
Sensors, 2019

The Quantum Shor Algorithm Simulated on FPGA.
Proceedings of the IEEE International Conference on Parallel & Distributed Processing with Applications, 2019

A Deep Residual Networks Accelerator on FPGA.
Proceedings of the Eleventh International Conference on Advanced Computational Intelligence, 2019

2018
Attribute-based fuzzy identity access control in multicloud computing environments.
Soft Comput., 2018

An OpenCL™ Implementation of WebP Accelerator on FPGAs.
Proceedings of the Applied Reconfigurable Computing. Architectures, Tools, and Applications, 2018

2017
Flexible CP-ABE Based Access Control on Encrypted Data for Mobile Users in Hybrid Cloud System.
J. Comput. Sci. Technol., 2017

A multi-frequency receiver function inversion approach for crustal velocity structure.
Comput. Geosci., 2017

2016
A Three-Factor Based Remote User Authentication Scheme: Strengthening Systematic Security and Personal Privacy for Wireless Communications.
Wirel. Pers. Commun., 2016

A Novel Protocol-Feature Attack against Tor's Hidden Service.
IEICE Trans. Inf. Syst., 2016

2014
Secure Privacy-Preserving Biometric Authentication Scheme for Telecare Medicine Information Systems.
J. Medical Syst., 2014

2013
An improved authentication with key agreement scheme on elliptic curve cryptosystem for global mobility networks.
Int. J. Netw. Manag., 2013

2012
An improved dynamic ID-based remote user authentication with key agreement scheme.
Comput. Electr. Eng., 2012

