Python code for "iDNA-ABF: multi-scale deep biological language learning model for the accurate and interpretable prediction of DNA methylations"

iDNA_ABF

Important! We are very sorry: some data in the additional file were wrong due to a table formatting problem. We have uploaded a corrected version as ./additional_file.pdf.

Introduction

This repository contains code for "iDNA-ABF: a deep learning sequence modeling framework for DNA methylation prediction".

Here are the article link and the webserver link.

We will provide a Google Drive link for downloading the dataset and model parameters in the future.

We have provided the base parameters on OneDrive; you can download them via this OneDrive share.

Questions are welcome via GitHub issues, and I will do my best to solve your problems.

Get Started

Thanks to Yingying Yu (a former member of WeiLab, now pursuing her PhD at CityU). She offers an NNI version based on PyTorch Lightning, and you can reproduce the relevant results from her repository.

Basic directory

You can change training parameters in configuration/config.py.

You can change the model structure in model/ClassificationDNAbert.py.

You can change the training process and dataset processing in frame/ModelManager.py and frame/DataManager.py.

Besides, the dataset used in the paper "iDNA-ABF" is included in data/DNA_MS.
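As a rough illustration of the kind of settings configured in configuration/config.py, here is a minimal sketch. All field names and default values below are assumptions for illustration, not the repository's actual attributes; check config.py for the real options.

```python
# Hypothetical sketch of a training configuration; the attribute names and
# defaults are illustrative assumptions, not the repository's actual config.
from dataclasses import dataclass


@dataclass
class TrainConfig:
    dataset: str = "data/DNA_MS"          # dataset root described above
    max_len: int = 41                      # methylation windows are short, fixed-length
    batch_size: int = 32
    learning_rate: float = 1e-4
    epochs: int = 50
    pretrain_path: str = "pretrain"        # where the downloaded DNABERT weights go


config = TrainConfig()
print(config.learning_rate)
```

Overriding a field (e.g. `TrainConfig(batch_size=64)`) would then be the analogue of editing the corresponding entry in config.py before running training.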

Pretrained model

You should download the pretrained model from the relevant GitHub repository.

For example, if you want to use DNABERT, put the downloaded weights into the pretrain folder and set the corresponding option in the model.
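DNABERT-style models do not consume raw sequences directly; they expect the DNA string split into overlapping k-mer tokens. Below is a minimal sketch of that tokenization (the helper name `kmer_tokenize` is mine, not from this repository):

```python
def kmer_tokenize(seq: str, k: int) -> list[str]:
    """Split a DNA sequence into overlapping k-mers with stride 1,
    the input format DNABERT-style tokenizers expect."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]


print(kmer_tokenize("ATCGGA", 3))  # ['ATC', 'TCG', 'CGG', 'GGA']
```

Running the same sequence through two different values of k yields the two token streams that a multi-scale setup like iDNA-ABF can feed to its separate language-model branches.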

Usage

python main/train.py
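Before launching training, it can help to confirm the layout described above is in place. The pre-flight check below is not part of the repository; it simply verifies that the files and folders mentioned in this README exist:

```python
# Optional pre-flight check (not part of the repository): verify the folder
# layout described in this README before running `python main/train.py`.
from pathlib import Path


def check_layout(root: str = ".") -> list[str]:
    """Return the expected paths that are missing under `root`."""
    expected = [
        "configuration/config.py",
        "model/ClassificationDNAbert.py",
        "frame/ModelManager.py",
        "frame/DataManager.py",
        "data/DNA_MS",
        "main/train.py",
    ]
    base = Path(root)
    return [p for p in expected if not (base / p).exists()]


missing = check_layout(".")
if missing:
    print("Missing paths:", ", ".join(missing))
```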
