
Long short-term memory (LSTM)



LSTM (Long Short-Term Memory) is a type of Recurrent Neural Network (RNN) that incorporates feedback connections, unlike feed-forward networks. An LSTM can remember previous states and take them into account when making predictions. This enables it to capture context from earlier words and sentence structure, and to handle sequential data such as time-series forecasting. An LSTM cell consists of four gates: the input gate, forget gate, modulation gate, and output gate. Together they let the model carry information forward from previous states and combine it with the current input to produce the next state.
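The four-gate update described above can be written in the standard LSTM formulation (a reconstruction in the usual notation, consistent with the constraints listed below; the weight matrices $W_f, W_i, W_g, W_o$ and the modulation bias $b_g$ are the conventional symbols, not taken from the original text):

$$
\begin{aligned}
f_t &= \sigma\!\left(W_f\,[h_{t-1}, x_t] + b_f\right) && \text{(forget gate)} \\
i_t &= \sigma\!\left(W_i\,[h_{t-1}, x_t] + b_i\right) && \text{(input gate)} \\
g_t &= \tanh\!\left(W_g\,[h_{t-1}, x_t] + b_g\right) && \text{(modulation gate)} \\
o_t &= \sigma\!\left(W_o\,[h_{t-1}, x_t] + b_o\right) && \text{(output gate)} \\
c_t &= f_t \odot c_{t-1} + i_t \odot g_t && \text{(cell state)} \\
h_t &= o_t \odot \tanh(c_t) && \text{(hidden state)}
\end{aligned}
$$

Because $\sigma$ maps into $(0, 1)$, the forget, input, and output gates act as soft switches that scale how much of the past cell state is kept, how much new information is written, and how much of the cell state is exposed as output.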

In which:

  • 0 < f_t, i_t, o_t < 1: the forget, input, and output gate activations, kept in this range by the sigmoid function

  • b_f, b_i, b_o: the corresponding bias terms
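As a sketch of how these quantities interact in practice, the single-step update of one LSTM cell can be written directly in NumPy. This is an illustrative implementation under standard assumptions, not Minerva AI's actual model code; the stacked weight matrix `W`, the function names, and the toy dimensions are all hypothetical.

```python
import numpy as np

def sigmoid(z):
    # Squashes pre-activations into (0, 1), matching 0 < f_t, i_t, o_t < 1.
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM time step.

    W maps the concatenation [h_prev; x_t] to the four gate
    pre-activations stacked as (forget, input, modulation, output);
    b holds the corresponding bias terms b_f, b_i, b_g, b_o.
    """
    z = W @ np.concatenate([h_prev, x_t]) + b
    H = h_prev.size
    f_t = sigmoid(z[0:H])            # forget gate
    i_t = sigmoid(z[H:2 * H])        # input gate
    g_t = np.tanh(z[2 * H:3 * H])    # modulation gate (candidate values)
    o_t = sigmoid(z[3 * H:4 * H])    # output gate
    c_t = f_t * c_prev + i_t * g_t   # keep some old state, write some new
    h_t = o_t * np.tanh(c_t)         # expose a gated view of the cell state
    return h_t, c_t

# Toy dimensions: input size 3, hidden size 2.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
h, c = np.zeros(2), np.zeros(2)
W = rng.standard_normal((4 * 2, 2 + 3)) * 0.1
b = np.zeros(4 * 2)
h, c = lstm_step(x, h, c, W, b)
```

Iterating `lstm_step` over the elements of a sequence, feeding each step's `h` and `c` into the next, is what lets the cell accumulate context across time, the property the paragraph above relies on for time-series forecasting.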
