
CircSSNN: circRNA-binding site prediction via sequence self-attention neural networks with pre-normalization

2023/2/7

PaperPlayer biorxiv bioinformatics

Shownotes

Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2023.02.07.527436v1?rss=1

Authors: Cao, C., Yang, S., Li, M., Li, C.

Abstract: Circular RNAs (circRNAs) play a significant role in some diseases by acting as transcription templates. Therefore, analyzing the interaction mechanism between circRNAs and RNA-binding proteins (RBPs) has far-reaching implications for the prevention and treatment of diseases. Existing models for circRNA-RBP identification mostly adopt CNNs, RNNs, or their variants as feature extractors, and most of them suffer from drawbacks such as poor parallelism, insufficient stability, and an inability to capture long-range dependencies. To address these issues, we design a Seq_transformer module to extract deep semantic features and then propose a circRNA-RBP identification model based on Sequence Self-attention with Pre-normalization. We test it on 37 circRNA datasets and 31 linear RNA datasets using the same set of hyperparameters, and the overall performance of the proposed model is highly competitive and, in some cases, significantly outperforms state-of-the-art methods. The experimental results indicate that the proposed model is scalable and transformable and can be applied to a wide range of tasks without task-oriented fine-tuning of parameters. The code is available at https://github.com/cc646201081/CircSSNN.
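The abstract's key architectural idea is a self-attention encoder with pre-normalization, i.e. applying layer normalization before the attention and feed-forward sub-layers rather than after, which is commonly used to stabilize Transformer training. The following is a minimal, illustrative sketch of such a pre-normalization block in PyTorch; the class name SeqTransformerBlock and the hyperparameters (d_model, n_heads, d_ff, dropout) are assumptions for demonstration and do not reflect the authors' actual implementation in the linked repository.

```python
import torch
import torch.nn as nn


class SeqTransformerBlock(nn.Module):
    """Illustrative pre-normalization (pre-LN) self-attention encoder block."""

    def __init__(self, d_model=128, n_heads=8, d_ff=512, dropout=0.1):
        super().__init__()
        self.norm1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads,
                                           dropout=dropout, batch_first=True)
        self.norm2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff),
            nn.GELU(),
            nn.Dropout(dropout),
            nn.Linear(d_ff, d_model),
        )
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):
        # Pre-normalization: LayerNorm is applied before self-attention, and
        # the residual connection bypasses the normalized path.
        h = self.norm1(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        x = x + self.dropout(attn_out)
        # Same pattern for the position-wise feed-forward sub-layer.
        x = x + self.dropout(self.ff(self.norm2(x)))
        return x


if __name__ == "__main__":
    # Example: a batch of 4 RNA sequence fragments of length 101, already
    # embedded/projected to d_model dimensions before entering the block.
    tokens = torch.randn(4, 101, 128)
    block = SeqTransformerBlock(d_model=128, n_heads=8)
    print(block(tokens).shape)  # torch.Size([4, 101, 128])
```

Compared with post-normalization, placing LayerNorm before each sub-layer keeps the residual stream unnormalized, which typically makes training less sensitive to learning-rate warm-up and allows a single hyperparameter setting to work across many datasets, consistent with the abstract's claim of using one configuration for all 68 datasets.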

Copyright belongs to the original authors. Visit the link for more info.

Podcast created by Paper Player, LLC