<!-- papers_summary.html -->
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Papers Summary</title>
<style>
body {
background-color: #d1e4dd;
margin: 0;
padding: 0;
font-family: Arial, sans-serif;
}
.row {
display: flex;
justify-content: center;
align-items: center;
/* height: 100vh; Remove or adjust the height as needed */
}
.box {
background-color: white;
padding: 20px;
text-align: center;
box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);
margin: 20px; /* Add margin between boxes */
}
</style>
</head>
<body>
<div class="row">
<div class="box">
<!-- Content for the first row goes here -->
<h1>Enformer (2021): Effective gene expression prediction from sequence by integrating long-range interactions</h1>
<p>
Nature Methods (IF: 60), cited by 369
</p>
<img src="your_image_url.jpg" alt="Your Image" width="100">
<p><strong>Check Point: Enformer</strong></p>
<ul style="text-align: left;">
<li>The first model to employ a hybrid CNN + Transformer architecture for predicting gene expression levels and epigenetic features.</li>
</ul>
<p><strong>Achievements</strong></p>
<ul style="text-align: left;">
<li>Trained on the same dataset as Basenji2 (an updated version of Basenji).</li>
<li>Achieved markedly higher accuracy than its predecessors.</li>
</ul>
<p><strong>Training Approach</strong></p>
<ul style="text-align: left;">
<li>Trained by alternately feeding human and mouse genomic sequences.</li>
<li>This alternating scheme enables cross-species inference.</li>
</ul>
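<p>A minimal, framework-free sketch of the alternating-species batch schedule described above (all names here are illustrative placeholders, not from the paper's released code; Enformer additionally uses species-specific output heads, which this sketch omits):</p>
<pre style="text-align: left;">

```python
# Hypothetical sketch of Enformer-style alternating-species training order:
# one human batch, then one mouse batch, per optimization step.

def alternating_schedule(human_batches, mouse_batches):
    """Interleave batches so the species alternates every training step."""
    order = []
    for human_batch, mouse_batch in zip(human_batches, mouse_batches):
        order.append(("human", human_batch))
        order.append(("mouse", mouse_batch))
    return order

# Toy usage: batch contents stand in for (sequence, target) tensors.
schedule = alternating_schedule(["h1", "h2"], ["m1", "m2"])
# schedule == [("human", "h1"), ("mouse", "m1"), ("human", "h2"), ("mouse", "m2")]
```

</pre>
<p>In practice each scheduled batch would be fed to the shared trunk with the matching species head before computing the loss.</p>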
<a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8490152/pdf/41592_2021_Article_1252.pdf">Effective gene expression prediction from sequence by integrating long-range interactions </a>
</div>
</div>
<div class="row">
<div class="box">
<!-- Content for the second row goes here -->
<h1>To Transformers and Beyond: Large Language Models for the Genome</h1>
<img src="your_image_url.jpg" alt="Your Image" width="100">
<p><strong>Abstract:</strong>
In the rapidly evolving landscape of genomics, deep learning has emerged as a useful tool for tackling complex computational
challenges. This review focuses on the transformative role of Large Language Models (LLMs), which are mostly based on the
transformer architecture, in genomics. Building on the foundation of traditional convolutional neural networks and recurrent
neural networks, we explore both the strengths and limitations of transformers and other LLMs for genomics. Additionally, we
contemplate the future of genomic modeling beyond the transformer architecture based on current trends in research. The
paper aims to serve as a guide for computational biologists and computer scientists interested in LLMs for genomic data. We
hope the paper can also serve as an educational introduction and discussion for biologists to a fundamental shift in how we will
be analyzing genomic data in the future.</p>
<a href="https://arxiv.org/abs/2311.07621">To Transformers and Beyond: Large Language Models for the Genome</a>
</div>
</div>
<!-- Add more rows as needed -->
</body>
</html>