
Predicting prime editing efficiency and product purity by deep learning

Research article


Abstract

Prime editing is a versatile genome editing tool but requires experimental optimization of the prime editing guide RNA (pegRNA) to achieve high editing efficiency. Here, we conducted a high-throughput screen to analyze prime editing outcomes of 92,423 pegRNAs on a highly diverse set of 13,349 human pathogenic mutations, including base substitutions, insertions, and deletions. Based on this dataset, we identified sequence context features that influence prime editing and trained PRIDICT (PRIme editing guide preDICTion), an attention-based bidirectional recurrent neural network. PRIDICT reliably predicts editing rates for all small-sized genetic changes, with a Spearman's R of 0.85 and 0.78 for intended and unintended edits, respectively. We validated PRIDICT on endogenous editing sites as well as an external dataset and showed that pegRNAs with high (>70) PRIDICT scores achieved substantially higher prime editing efficiencies than pegRNAs with low (<70) scores, both in different cell types in vitro (12-fold) and in hepatocytes in vivo (10-fold), highlighting the value of PRIDICT for basic and translational research applications.
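The abstract describes an attention-based bidirectional recurrent network that maps a pegRNA/target sequence context to an editing-efficiency score. The Python (PyTorch) code below is a minimal sketch of that model family, not the authors' released PRIDICT implementation: the one-hot encoding, hidden size, single-layer bidirectional LSTM, and attention pooling scheme are all illustrative assumptions.

import torch
import torch.nn as nn

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def one_hot(seq):
    # Encode a DNA sequence as a (length, 4) one-hot tensor.
    x = torch.zeros(len(seq), 4)
    for i, base in enumerate(seq):
        x[i, BASES[base]] = 1.0
    return x

class BiRNNAttentionRegressor(nn.Module):
    # Hypothetical stand-in for a PRIDICT-style model: a bidirectional LSTM
    # reads the sequence in both directions, an attention layer weights each
    # position, and a linear head regresses a single efficiency score.
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.LSTM(input_size=4, hidden_size=hidden,
                           bidirectional=True, batch_first=True)
        self.attn = nn.Linear(2 * hidden, 1)  # per-position attention logits
        self.head = nn.Linear(2 * hidden, 1)  # efficiency regression head

    def forward(self, x):                       # x: (batch, length, 4)
        h, _ = self.rnn(x)                      # (batch, length, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)  # attention over positions
        context = (w * h).sum(dim=1)            # attention-weighted summary
        return self.head(context).squeeze(-1)   # (batch,) predicted scores

model = BiRNNAttentionRegressor()
batch = torch.stack([one_hot("ACGTACGTACGTACGTACGT"),
                     one_hot("TTGCATGCAAGCTTGGCACT")])
print(model(batch))  # two untrained, unscaled predictions

A model of this shape would be trained against measured editing rates and evaluated by rank correlation (Spearman's R), matching the evaluation reported in the abstract; its raw outputs are meaningless until trained.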

Author and article information

Journal
Nature Biotechnology (Nat Biotechnol)
NLM journal ID: 9604648
ISSN: 1087-0156 (print); 1546-1696 (electronic)
History: 16 November 2022; 16 January 2023; 11 August 2023
Affiliations
[1] Institute of Pharmacology and Toxicology, University of Zurich, Zurich, Switzerland
[2] Department of Quantitative Biomedicine, University of Zurich, Zurich, Switzerland
[3] Institute of Molecular Health Sciences, ETH Zurich, Zurich, Switzerland
Article
Manuscript ID: EMS157167
DOI: 10.1038/s41587-022-01613-7
PMCID: PMC7614945
PMID: 36646933

Users may view, print, copy, and download text and data-mine the content in such documents, for the purposes of academic research, subject always to the full Conditions of use: https://www.springernature.com/gp/open-research/policies/accepted-manuscript-terms

Categories
Article; Biotechnology
