FinGPT (public template, forked from AI4Finance-Foundation/FinGPT)

Data-Centric FinGPT. Open-source for open finance! Revolutionize 🔥 We release the trained model on HuggingFace.



Data-Centric FinGPT: Open-source for Open Finance.


Let us not expect Wall Street to open-source LLMs or open APIs, given FinTech institutions' internal regulations and policies.

We democratize Internet-scale data for financial large language models (FinLLMs) at FinNLP (see the FinNLP website).

Blueprint of FinGPT

Disclaimer: We share this code for academic purposes under the MIT education license. Nothing herein is financial advice, nor a recommendation to trade real money. Please use common sense and always consult a professional before trading or investing.

Why FinGPT?

1). Finance is highly dynamic. BloombergGPT trained an LLM on a mixture of finance data and general-purpose data, which took about 53 days at a cost of around $3M. It is costly to retrain an LLM like BloombergGPT every month or every week, so lightweight adaptation is highly favorable. FinGPT can be fine-tuned swiftly to incorporate new data (the cost falls significantly, to less than $300 per fine-tuning).

2). Democratizing Internet-scale financial data is critical, e.g., allowing timely model updates (monthly or weekly) through an automatic data-curation pipeline. BloombergGPT has privileged data access and APIs, while FinGPT presents a more accessible alternative: it prioritizes lightweight adaptation, leveraging the best available open-source LLMs.

3). The key technology is RLHF (reinforcement learning from human feedback), which is missing in BloombergGPT. RLHF enables an LLM to learn individual preferences (risk-aversion level, investing habits, personalized robo-advising, etc.), which is the "secret" ingredient of ChatGPT and GPT-4.
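At the heart of RLHF is a reward model trained on human preference pairs. A minimal, illustrative sketch (not FinGPT's actual implementation) of the standard pairwise preference loss, `-log σ(r_chosen − r_rejected)`, in plain Python:

```python
import math

def preference_loss(r_chosen: float, r_rejected: float) -> float:
    """Pairwise reward-model loss used in RLHF: -log sigmoid(r_chosen - r_rejected).

    The loss shrinks as the reward model scores the human-preferred
    response increasingly higher than the rejected one.
    """
    margin = r_chosen - r_rejected
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# A larger margin in favor of the preferred answer gives a smaller loss.
print(preference_loss(2.0, 0.0) < preference_loss(0.5, 0.0))  # → True
```

In a full RLHF pipeline this reward model then guides policy optimization (e.g., PPO); the scalar rewards could encode user-specific preferences such as risk tolerance.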

FinGPT Demos

  • FinGPT V3 (Updated on 8/4/2023)

    • The FinGPT v3 series are LLMs fine-tuned with the LoRA method on news and tweet sentiment-analysis datasets, achieving the best scores on most financial sentiment-analysis benchmarks.
    • FinGPT v3.1 uses ChatGLM2-6B as its base model; FinGPT v3.2 uses Llama2-7B.
    • Benchmark results:

      | Weighted F1 | BloombergGPT | ChatGLM2 | Llama2 | FinGPT v3.1 | v3.1.1 (8-bit) | v3.1.2 (QLoRA) | FinGPT v3.2 |
      |---|---|---|---|---|---|---|---|
      | FPB | 0.511 | 0.381 | 0.390 | 0.855 | 0.855 | 0.777 | 0.850 |
      | FiQA-SA | 0.751 | 0.790 | 0.800 | 0.850 | 0.847 | 0.752 | 0.860 |
      | TFNS | - | 0.189 | 0.296 | 0.875 | 0.879 | 0.828 | 0.894 |
      | NWGI | - | 0.449 | 0.503 | 0.642 | 0.632 | 0.583 | 0.636 |
      | Devices | 512 × A100 | 64 × A100 | 2048 × A100 | 8 × A100 | A100 | A100 | 8 × A100 |
      | Time | 53 days | 2.5 days | 21 days | 2 hours | 6.47 hours | 4.15 hours | 2 hours |
      | Cost | $2.67 million | $14,976 | $4.23 million | $65.6 | $25.88 | $17.01 | $65.6 |
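The weighted F1 metric reported in the benchmark can be sketched in plain Python (this mirrors the standard definition, e.g. scikit-learn's `average="weighted"`; the label names below are illustrative):

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Weighted F1: per-class F1 averaged with weights proportional
    to each class's support (frequency) in y_true."""
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for c in set(y_true):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        score += (support[c] / total) * f1
    return score

labels = ["positive", "negative", "neutral", "positive"]
preds  = ["positive", "negative", "positive", "positive"]
print(round(weighted_f1(labels, preds), 3))  # → 0.65
```

Weighting by support means frequent sentiment classes dominate the score, which matters on imbalanced financial datasets such as TFNS.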

Cost per GPU hour: for A100 GPUs, the AWS p4d.24xlarge instance, equipped with 8 A100 GPUs, is used as the benchmark to estimate costs; note that BloombergGPT also used p4d.24xlarge. As of July 11, 2023, the hourly rate for this instance is $32.773, so the estimated cost per GPU hour is $32.77 / 8 ≈ $4.10, which serves as the reference unit price (1 GPU hour). BloombergGPT's estimated cost is then 512 GPUs × 53 days × 24 hours = 651,264 GPU hours, × $4.10 = $2,670,182.40.
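The arithmetic behind that estimate can be reproduced in a few lines (all numbers are taken from the paragraph above):

```python
# AWS p4d.24xlarge pricing as of July 11, 2023: $32.773/hour for 8 A100s.
instance_hourly_rate = 32.773
gpus_per_instance = 8
price_per_gpu_hour = round(instance_hourly_rate / gpus_per_instance, 2)  # 4.10

# BloombergGPT: 512 A100s running for 53 days.
gpu_hours = 512 * 53 * 24  # → 651,264 GPU hours
cost = gpu_hours * price_per_gpu_hour
print(gpu_hours, cost)  # ≈ 651264 GPU hours, $2,670,182.40
```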

  • Reproduce the results by running the benchmarks; a detailed tutorial is on the way.

  • Fine-tune your own FinGPT v3 model with the LoRA method on a single RTX 3090 with this notebook in 8-bit or this notebook in int4 (QLoRA)

  • FinGPT V2

    • Let's train our own FinGPT for the American financial market with LLaMA and LoRA (Low-Rank Adaptation)
  • FinGPT V1

    • Let's train our own FinGPT for the Chinese financial market with ChatGLM and LoRA (Low-Rank Adaptation)

Understanding FinGPT: An Educational Blog Series

What is FinGPT and FinNLP?

The Goals of FinGPT

  1. Real-time data curation pipeline to democratize data for FinGPT
  2. Lightweight adaptation to democratize the FinGPT model for both individuals and institutes (frequent updates)
  3. Support various financial applications
  • FinNLP provides a playground for everyone interested in LLMs and NLP in finance. We provide full pipelines for LLM training and fine-tuning in the financial domain; the full architecture is shown in the following picture. Detailed code and introductions can be found here, or you may refer to the wiki.

End-to-end framework: FinGPT embraces a full-stack framework for FinLLMs with four layers:

  • Data source layer: This layer assures comprehensive market coverage, addressing the temporal sensitivity of financial data through real-time information capture.
  • Data engineering layer: Primed for real-time NLP data processing, this layer tackles the inherent challenges of high temporal sensitivity and low signal-to-noise ratio in financial data.
  • LLMs layer: Focusing on a range of fine-tuning methodologies such as LoRA, this layer mitigates the highly dynamic nature of financial data, ensuring the model’s relevance and accuracy.
  • Application layer: Showcasing practical applications and demos, this layer highlights the potential capability of FinGPT in the financial sector.
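To make the data-engineering layer concrete, here is a minimal, illustrative sketch (an assumed design, not FinGPT's actual code) of the kind of curation step it describes: stripping markup, dropping low-signal items, deduplicating, and timestamping incoming headlines:

```python
import re
import time

def curate(headlines):
    """Toy real-time curation step: strip stray HTML, drop very short
    (low signal-to-noise) items, dedupe, and timestamp each headline."""
    seen = set()
    curated = []
    for raw in headlines:
        text = re.sub(r"<[^>]+>", "", raw).strip()  # remove stray markup
        if len(text) < 15 or text.lower() in seen:  # low signal or duplicate
            continue
        seen.add(text.lower())
        curated.append({"text": text, "ingested_at": time.time()})
    return curated

batch = [
    "<b>Fed raises rates by 25bps, signals pause</b>",
    "Fed raises rates by 25bps, signals pause",  # duplicate after cleaning
    "lol",                                       # too short to carry signal
]
print([item["text"] for item in curate(batch)])
# → ['Fed raises rates by 25bps, signals pause']
```

A production pipeline would add source-specific parsers, language detection, and entity tagging, but the shape, a stream of cleaned, timestamped records feeding the LLMs layer, is the same.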

Open-Source Base Model used in the LLMs layer of FinGPT

  • Feel free to contribute more open-source base models tailored for various language-specific financial markets.
| Base Model | Pretraining Tokens | Context Length | Model Advantages | Model Size | Experiment Results | Applications |
|---|---|---|---|---|---|---|
| Llama-2 | 2 trillion | 4096 | Excels on English-based market data | llama-2-7b, llama-2-13b | Consistently shows superior fine-tuning results | Financial sentiment analysis, robo-advisor |
| Falcon | 1,500B | 2048 | Maintains high-quality results while being more resource-efficient | falcon-7b | Good for English market data | Financial sentiment analysis |
| ChatGLM2 | 1.4T | 32K | Potent Chinese-language expression capability | chatglm2-6b | Good for Chinese market data | Financial sentiment analysis, financial report summary |
| InternLM | 1.8T | 8k | Flexibly and independently constructs workflows | internlm-7b | Good for Chinese market data | Financial sentiment analysis |
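The table above is easy to encode as data; a hypothetical helper (the model identifiers mirror the table, but the selection logic is purely illustrative) for picking a base model per target market:

```python
# Base-model registry mirroring the table above (context lengths in tokens).
BASE_MODELS = {
    "llama-2-7b":  {"market": "english", "context": 4096},
    "falcon-7b":   {"market": "english", "context": 2048},
    "chatglm2-6b": {"market": "chinese", "context": 32768},
    "internlm-7b": {"market": "chinese", "context": 8192},
}

def pick_base_model(market: str, min_context: int = 0) -> str:
    """Pick a registered base model for a market, preferring longer context."""
    candidates = [
        (spec["context"], name)
        for name, spec in BASE_MODELS.items()
        if spec["market"] == market and spec["context"] >= min_context
    ]
    if not candidates:
        raise ValueError(f"no base model registered for market={market!r}")
    return max(candidates)[1]

print(pick_base_model("chinese"))        # → chatglm2-6b
print(pick_base_model("english", 4096))  # → llama-2-7b
```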

News

ChatGPT at AI4Finance

Introductory

The Journey of OpenAI GPT Models. GPT models explained: OpenAI's GPT-1, GPT-2, GPT-3.

(Financial) Big Data

Interesting Demos

  • GPT-3 Creative Fiction: creative writing by OpenAI's GPT-3 model, demonstrating poetry, dialogue, puns, literary parodies, and storytelling, plus advice on effective GPT-3 prompt programming and avoiding common errors.

ChatGPT for FinTech

ChatGPT Trading Bot

Links
