- Access the basic ChatGPT model without authentication
OpenAI recently made the basic ChatGPT model accessible anonymously, so you can use it without authenticating. Read more about it in the official docs.
- Install the package/CLI

```bash
pip install anonymous-chatgpt
```

- Use the CLI

```bash
chatgpt --prompt "hello world"
# OR
chatgpt --chat
```
- Use the Python API

```python
from anonymous_chatgpt import chat_prompt, ChatGPT

# One-off prompt
message = chat_prompt(prompt="hello world")
print(message)

# Multi-turn chat: the ChatGPT instance keeps the conversation context
chatgpt = ChatGPT()
resp1 = chatgpt.chat(prompt="hello, my name is John")
resp2 = chatgpt.chat(prompt="what is my name?")
```
- Send the first request to the ChatGPT web app, i.e. `chat.openai.com`
- The response to that request sets cookies used for authentication of the requests (not of the user; just user-agent and CSRF tokens)
- Those cookies are carried on all subsequent requests
- Send the second request to the Sentinel API, i.e. `/backend-anon/sentinel/chat-requirements`
- This request gives us the `sentinel-token`, which is used for authentication and authorization of the requests (not of users)
- We use this token in all subsequent requests
- The third request is the actual request to the anonymous conversation endpoint, i.e. `/backend-anon/conversation`
- This is a streamed request that returns the response in chunks (see the sketch after this list)
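
To make the flow concrete, here is a minimal sketch of the three requests using `requests`. The header names (`oai-device-id`, `openai-sentinel-chat-requirements-token`), the payload fields, and the response shapes are assumptions for illustration and may not match exactly what the package sends.

```python
# Minimal sketch of the anonymous request flow (header/field names are assumptions).
import uuid
import requests

BASE_URL = "https://chat.openai.com"
session = requests.Session()  # keeps the cookies across all three requests
session.headers.update({"User-Agent": "Mozilla/5.0"})

# 1) First request: hit the web app so the response sets the auth cookies
session.get(BASE_URL)

# 2) Second request: ask the Sentinel API for a sentinel token
device_id = str(uuid.uuid4())  # assumed: an anonymous device-id header
req = session.post(
    f"{BASE_URL}/backend-anon/sentinel/chat-requirements",
    headers={"oai-device-id": device_id},
    json={},
)
sentinel_token = req.json().get("token")  # assumed response shape

# 3) Third request: the anonymous conversation endpoint (streamed)
payload = {  # assumed payload shape
    "action": "next",
    "messages": [
        {
            "author": {"role": "user"},
            "content": {"content_type": "text", "parts": ["hello world"]},
        }
    ],
}
resp = session.post(
    f"{BASE_URL}/backend-anon/conversation",
    headers={
        "oai-device-id": device_id,
        "openai-sentinel-chat-requirements-token": sentinel_token,
        "Accept": "text/event-stream",
    },
    json=payload,
    stream=True,
)

# The response arrives as a stream of server-sent-event chunks
for chunk in resp.iter_lines():
    if chunk:
        print(chunk.decode("utf-8"))
```

Using a single `requests.Session` is what lets the cookies set by the first request be carried automatically on the sentinel and conversation requests.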