Commit

by default, we use remote login to avoid account restrictions.
Signed-off-by: pengzhile <[email protected]>
pengzhile committed Jul 24, 2023
1 parent eaf8522 commit 7aff316
Showing 13 changed files with 53 additions and 69 deletions.
4 changes: 2 additions & 2 deletions bin/startup.sh
@@ -26,8 +26,8 @@ if [ -n "${PANDORA_API}" ]; then
   PANDORA_ARGS="${PANDORA_ARGS} -a"
 fi

-if [ -n "${PANDORA_SENTRY}" ]; then
-  PANDORA_ARGS="${PANDORA_ARGS} --sentry"
+if [ -n "${PANDORA_LOGIN_LOCAL}" ]; then
+  PANDORA_ARGS="${PANDORA_ARGS} -l"
 fi

 if [ -n "${PANDORA_VERBOSE}" ]; then
4 changes: 2 additions & 2 deletions doc/wiki.md
@@ -106,9 +106,9 @@
 * `-t` `--token_file` specifies a file that stores an `Access Token`, and logs in with that `Access Token`.
 * `-s` `--server` starts in `http` server mode, format: `ip:port`.
 * `-a` `--api` uses the `gpt-3.5-turbo` API for requests, **you may need to pay `OpenAI` for this**.
+* `-l` `--local` logs in using the local environment, **you may need a suitable proxy IP to avoid account risk control!**
 * `--tokens_file` specifies a file that stores multiple `Access Token`s, with content of the form `{"key": "token"}`.
 * `--threads` specifies the number of server threads, default `8`, and `4` in Cloud mode.
-* `--sentry` enables the `sentry` framework to send error reports to the author for debugging; sensitive information **will not be sent**.
 * `-v` `--verbose` shows debug information and prints the exception stack trace on errors, for troubleshooting.

 ## Docker environment variables
@@ -118,7 +118,7 @@
 * `PANDORA_PROXY` specifies a proxy, format: `protocol://user:pass@ip:port`.
 * `PANDORA_SERVER` starts in `http` server mode, format: `ip:port`.
 * `PANDORA_API` uses the `gpt-3.5-turbo` API for requests, **you may need to pay `OpenAI` for this**.
-* `PANDORA_SENTRY` enables the `sentry` framework to send error reports to the author for debugging; sensitive information **will not be sent**.
+* `PANDORA_LOGIN_LOCAL` logs in using the local environment, **you may need a suitable proxy IP to avoid account risk control!**
 * `PANDORA_VERBOSE` shows debug information and prints the exception stack trace on errors, for troubleshooting.
 * `PANDORA_THREADS` specifies the number of server threads, default `8`, and `4` in Cloud mode.
 * With Docker, just set the environment variables; the `program arguments` above are ignored.
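
The `--tokens_file` option above expects a plain JSON object that maps a label of your choice to an `Access Token`. A minimal sketch of loading such a file (the file name and labels are illustrative, not part of the project):

```python
# Minimal sketch, not project code: read a --tokens_file of the
# {"key": "token"} form described above.
import json

with open('tokens.json', encoding='utf-8') as f:   # illustrative file name
    tokens = json.load(f)                           # e.g. {"work": "eyJhbGciOi...", "home": "..."}

for label, access_token in tokens.items():
    print(label, access_token[:16] + '...')
```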
4 changes: 2 additions & 2 deletions doc/wiki_en.md
@@ -105,9 +105,9 @@ Pandora, talking with ChatGPT in command lines, and with more surprises.
 * `-t` or `--token_file` indicates the file that stores an `Access Token`. You will log in with the access token if this option is in use.
 * `-s` or `--server` starts the HTTP server, by which you could open a web page and interact with it in a fancy UI. The value should be `ip:port`.
 * `-a` or `--api` uses the `gpt-3.5-turbo` API in the backend. **NOTICE: you will be charged if this option is in use.**
+* `-l` or `--local` logs in using the local environment. **You may need a suitable proxy IP to avoid account restrictions!**
 * `--tokens_file` indicates a file storing multiple `Access Token`s. The file content should be like `{"key": "token"}`.
 * `--threads` specifies the number of server workers, default is `8`, and for cloud mode it is `4`.
-* `--sentry` sends error messages to the author for improving Pandora. **Sensitive information won't be leaked.**
 * `-v` or `--verbose` for verbose debugging messages.

 ## Docker
@@ -119,7 +119,7 @@ These docker environment variables will override start parameters.
 * `PANDORA_PROXY` = `protocol://user:pass@ip:port`.
 * `PANDORA_SERVER` = `ip:port`.
 * `PANDORA_API` for using the `gpt-3.5-turbo` API. **NOTICE: you will be charged if this option is in use.**
-* `PANDORA_SENTRY` for sending error messages to the author to improve Pandora. **Sensitive information won't be leaked.**
+* `PANDORA_LOGIN_LOCAL` logs in using the local environment. **You may need a suitable proxy IP to avoid account restrictions!**
 * `PANDORA_VERBOSE` for verbose debugging messages.
 * `PANDORA_THREADS` specifies the number of server workers, default is `8`, and for cloud mode it is `4`.

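
To tie the Docker variables back to the program arguments, here is a hedged sketch of the mapping performed by `bin/startup.sh`, using only the variables and flags documented above (the helper function itself is illustrative):

```python
# Illustrative sketch of how the Docker environment variables translate into
# the documented command-line flags, mirroring bin/startup.sh.
import os

def build_pandora_args():
    args = []
    if os.getenv('PANDORA_SERVER'):
        args += ['-s', os.environ['PANDORA_SERVER']]
    if os.getenv('PANDORA_API'):
        args.append('-a')
    if os.getenv('PANDORA_LOGIN_LOCAL'):
        args.append('-l')   # local login: mind the proxy/IP risk-control note above
    if os.getenv('PANDORA_VERBOSE'):
        args.append('-v')
    if os.getenv('PANDORA_THREADS'):
        args += ['--threads', os.environ['PANDORA_THREADS']]
    return args

print(build_pandora_args())
```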
1 change: 0 additions & 1 deletion requirements.txt
@@ -9,6 +9,5 @@ flask[async]~=2.2.3
 flask-cors~=3.0.10
 waitress~=2.1.2
 loguru~=0.6.0
-sentry-sdk~=1.17.0
 pyjwt[crypto]~=2.6.0
 pyperclip~=1.8.2
2 changes: 1 addition & 1 deletion src/pandora/__init__.py
@@ -1,3 +1,3 @@
 # -*- coding: utf-8 -*-

-__version__ = '1.2.10'
+__version__ = '1.3.0'
6 changes: 1 addition & 5 deletions src/pandora/bots/server.py
@@ -115,11 +115,7 @@ def chat(self, conversation_id=None):
         query = {'chatId': [conversation_id]} if conversation_id else {}

         token_key = request.args.get('token')
-        rendered = render_template('chat.html',
-                                   pandora_base=request.url_root.strip('/'),
-                                   pandora_sentry=self.sentry,
-                                   query=query
-                                   )
+        rendered = render_template('chat.html', pandora_base=request.url_root.strip('/'), query=query)
         resp = make_response(rendered)

         if token_key:
13 changes: 4 additions & 9 deletions src/pandora/cloud_launcher.py
@@ -5,7 +5,6 @@
 from loguru import logger

 from . import __version__
-from .exts import sentry
 from .exts.hooks import hook_except_handle
 from .openai.utils import Console

@@ -48,8 +47,9 @@ def main():
         default=4,
     )
     parser.add_argument(
-        '--sentry',
-        help='Enable sentry to send error reports when errors occur.',
+        '-l',
+        '--local',
+        help='Login locally. Pay attention to the risk control of the login ip!',
         action='store_true',
     )
     parser.add_argument(
@@ -61,13 +61,10 @@
     args, _ = parser.parse_known_args()
     __show_verbose = args.verbose

-    if args.sentry:
-        sentry.init(args.proxy)
-
     try:
         from pandora_cloud.server import ChatBot as CloudServer

-        return CloudServer(args.proxy, args.verbose, args.sentry, True).run(args.server, args.threads)
+        return CloudServer(args.proxy, args.verbose, login_local=args.local).run(args.server, args.threads)
     except (ImportError, ModuleNotFoundError):
         Console.error_bh('### You need `pip install Pandora-ChatGPT[cloud]` to support cloud mode.')

@@ -82,5 +79,3 @@ def run():

         if __show_verbose:
             logger.exception('Exception occurred.')
-
-        sentry.capture(e)
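
For orientation, a hedged sketch of how the new flag reaches the cloud server after this change. The `login_local` keyword comes from the `CloudServer` call in the diff above; the `--proxy` flag name, bind address, and thread count are assumptions for illustration, not the project's actual defaults.

```python
# Sketch only: mirrors the cloud_launcher flow above after this commit.
import argparse

from pandora_cloud.server import ChatBot as CloudServer

parser = argparse.ArgumentParser()
parser.add_argument('-l', '--local', action='store_true',
                    help='Login locally. Pay attention to the risk control of the login ip!')
parser.add_argument('--proxy', default=None)   # assumed flag name
args = parser.parse_args(['-l'])               # opt in to local login

# Sentry reporting is gone; login behaviour is now controlled by -l/--local.
CloudServer(args.proxy, False, login_local=args.local).run('127.0.0.1:8018', 4)  # illustrative bind/threads
```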
21 changes: 0 additions & 21 deletions src/pandora/exts/sentry.py

This file was deleted.


2 changes: 1 addition & 1 deletion src/pandora/flask/templates/chat.html

Large diffs are not rendered by default.

18 changes: 8 additions & 10 deletions src/pandora/launcher.py
@@ -10,7 +10,6 @@
 from . import __version__
 from .bots.legacy import ChatBot as ChatBotLegacy
 from .bots.server import ChatBot as ChatBotServer
-from .exts import sentry
 from .exts.config import USER_CONFIG_DIR, default_api_prefix
 from .exts.hooks import hook_except_handle
 from .exts.token import check_access_token_out
@@ -167,8 +166,9 @@ def main():
         action='store_true',
     )
     parser.add_argument(
-        '--sentry',
-        help='Enable sentry to send error reports when errors occur.',
+        '-l',
+        '--local',
+        help='Login locally. Pay attention to the risk control of the login ip!',
         action='store_true',
     )
     parser.add_argument(
@@ -183,9 +183,6 @@
     Console.debug_b(''', Mode: {}, Engine: {}
     '''.format('server' if args.server else 'cli', 'turbo' if args.api else 'free'))

-    if args.sentry:
-        sentry.init(args.proxy)
-
     if args.api:
         try:
             from .openai.token import gpt_num_tokens
@@ -202,11 +199,14 @@
     access_token, need_save = confirm_access_token(args.token_file, args.server, args.api)
     if not access_token:
         Console.info_b('Please enter your email and password to log in ChatGPT!')
+        if not args.local:
+            Console.warn('We login via {}'.format(api_prefix))
+
         email = getenv('OPENAI_EMAIL') or Prompt.ask(' Email')
         password = getenv('OPENAI_PASSWORD') or Prompt.ask(' Password', password=True)
         mfa = getenv('OPENAI_MFA_CODE') or Prompt.ask(' MFA Code(Optional if not set)')
         Console.warn('### Do login, please wait...')
-        access_token = Auth0(email, password, args.proxy, mfa=mfa).auth(True)
+        access_token = Auth0(email, password, args.proxy, mfa=mfa).auth(args.local)

         if not check_access_token_out(access_token, args.api):
             return
@@ -225,7 +225,7 @@
     chatgpt = ChatGPT(access_tokens, args.proxy)

     if args.server:
-        return ChatBotServer(chatgpt, args.verbose, args.sentry).run(args.server, args.threads)
+        return ChatBotServer(chatgpt, args.verbose).run(args.server, args.threads)

     ChatBotLegacy(chatgpt).run()

@@ -240,5 +240,3 @@ def run():

         if __show_verbose:
             logger.exception('Exception occurred.')
-
-        sentry.capture(e)
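
In short, the launcher now defaults to the remote login path and only runs the full local Auth0 flow when `--local` is passed. A condensed sketch of that decision, based on the code above (module paths follow the file layout in this commit; `api_prefix` in the diff is assumed to be the value of `default_api_prefix()`):

```python
# Condensed sketch of the login decision introduced above; not the verbatim launcher code.
from pandora.exts.config import default_api_prefix
from pandora.openai.auth import Auth0

def do_login(email, password, proxy=None, mfa=None, local=False):
    if not local:
        # default: credentials are sent to the remote login endpoint
        print('We login via {}'.format(default_api_prefix()))
    # auth(True) runs the local Auth0 flow; auth(False) delegates to get_access_token_proxy()
    return Auth0(email, password, proxy, mfa=mfa).auth(local)
```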
2 changes: 1 addition & 1 deletion src/pandora/openai/api.py
@@ -44,7 +44,7 @@ async def __process_sse(self, resp):
             if 'data: [DONE]' == utf8_line[0:12]:
                 break

-            if 'data: {"message":' == utf8_line[0:17]:
+            if 'data: {"message":' == utf8_line[0:17] or 'data: {"id":' == utf8_line[0:12]:
                 yield json.loads(utf8_line[6:])

     @staticmethod
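
The change above widens the server-sent-events filter so that payloads beginning with an `id` field are parsed in addition to `message` events. A standalone sketch of that filter (the sample line is made up; the real code reads lines from the streaming HTTP response):

```python
# Standalone sketch of the SSE line filter after this change.
import json

def parse_sse_line(utf8_line):
    if utf8_line[0:12] == 'data: [DONE]':
        return None
    if utf8_line[0:17] == 'data: {"message":' or utf8_line[0:12] == 'data: {"id":':
        return json.loads(utf8_line[6:])
    return None

print(parse_sse_line('data: {"id": "cmpl-1", "text": "hi"}'))  # -> {'id': 'cmpl-1', 'text': 'hi'}
```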
43 changes: 30 additions & 13 deletions src/pandora/openai/auth.py
@@ -38,14 +38,14 @@ def __check_email(email: str):
         regex = r'\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Z|a-z]{2,7}\b'
         return re.fullmatch(regex, email)

-    def auth(self, login_local=True) -> str:
+    def auth(self, login_local=False) -> str:
         if self.use_cache and self.access_token and self.expires and self.expires > dt.now():
             return self.access_token

         if not self.__check_email(self.email) or not self.password:
             raise Exception('invalid email or password.')

-        return self.__part_one()
+        return self.__part_one() if login_local else self.get_access_token_proxy()

     def get_refresh_token(self):
         return self.refresh_token
@@ -189,6 +189,21 @@ def __part_seven(self, code_verifier: str, location: str) -> str:
         else:
             raise Exception('Error login.')

+    def __parse_access_token(self, resp):
+        if resp.status_code == 200:
+            json = resp.json()
+            if 'access_token' not in json:
+                raise Exception('Get access token failed, maybe you need a proxy.')
+
+            if 'refresh_token' in json:
+                self.refresh_token = json['refresh_token']
+
+            self.access_token = json['access_token']
+            self.expires = dt.utcnow() + datetime.timedelta(seconds=json['expires_in']) - datetime.timedelta(minutes=5)
+            return self.access_token
+        else:
+            raise Exception(resp.text)
+
     def get_access_token(self, code_verifier: str, callback_url: str) -> str:
         url_params = parse_qs(urlparse(callback_url).query)

@@ -213,16 +228,18 @@ def get_access_token(self, code_verifier: str, callback_url: str) -> str:
         }
         resp = self.session.post(url, headers=headers, json=data, allow_redirects=False, **self.req_kwargs)

-        if resp.status_code == 200:
-            json = resp.json()
-            if 'access_token' not in json:
-                raise Exception('Get access token failed, maybe you need a proxy.')
+        return self.__parse_access_token(resp)

-            if 'refresh_token' in json:
-                self.refresh_token = json['refresh_token']
+    def get_access_token_proxy(self) -> str:
+        url = '{}/auth/login'.format(default_api_prefix())
+        headers = {
+            'User-Agent': self.user_agent,
+        }
+        data = {
+            'username': self.email,
+            'password': self.password,
+            'mfa_code': self.mfa,
+        }
+        resp = self.session.post(url=url, headers=headers, data=data, allow_redirects=False, **self.req_kwargs)

-            self.access_token = json['access_token']
-            self.expires = dt.utcnow() + datetime.timedelta(seconds=json['expires_in']) - datetime.timedelta(minutes=5)
-            return self.access_token
-        else:
-            raise Exception(resp.text)
+        return self.__parse_access_token(resp)
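
Taken together, `auth()` now defaults to the remote proxy login: `get_access_token_proxy()` posts the credentials to `{default_api_prefix()}/auth/login` and parses the token response, while `login_local=True` opts back into the full local Auth0 flow. A hedged usage sketch (the credentials are placeholders; argument names follow the calls shown in this diff):

```python
# Usage sketch based on the Auth0 class above; email/password are placeholders.
from pandora.openai.auth import Auth0

auth = Auth0('user@example.com', 'hunter2', None, mfa=None)

remote_token = auth.auth()                  # default: remote login via default_api_prefix() + '/auth/login'
local_token = auth.auth(login_local=True)   # opt-in: full local Auth0 flow (mind IP risk control)
```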
