Commit
update readme with wechat.
shibing624 committed Nov 25, 2019
1 parent 5c9f9e3 commit c0198bf
Showing 4 changed files with 21,150 additions and 2 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -154,7 +154,7 @@ python preprocess.py
python train.py
```
Training process screenshot:
-![train image](https://github.com/shibing624/pycorrector/blob/master/pycorrector/data/git_image/seq2seq_train.png)
+![train image](./docs/git_image/seq2seq_train.png)


#### Prediction
@@ -220,7 +220,7 @@ input: 由我起开始做 output: 由我开始做
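
The prediction section itself is collapsed in this diff; the hunk header above shows one of its sample corrections. As a hedged illustration only, not the seq2seq infer script this README section actually documents, pycorrector's top-level API could produce a similar correction (assuming the package is installed via `pip install pycorrector`):

```python
# Hedged sketch: pycorrector's top-level rule-based API, assumed installed
# via `pip install pycorrector`; the seq2seq model's own infer script is
# collapsed in this diff and is not reproduced here.
import pycorrector

corrected, detail = pycorrector.correct('由我起开始做')
print(corrected)  # expected output along the lines of: 由我开始做
print(detail)     # detected corrections as (wrong, right, begin, end) items
```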


## Discussion group
-![image](https://user-images.githubusercontent.com/12003487/69050479-17fd9900-0a3d-11ea-8788-7a9898db2380.png)
+![image](./docs/git_image/erweima.png)

WeChat discussion group: anyone interested is welcome to join and discuss NLP text-correction techniques; if replies on issues are slow, you can also ask questions in the group.

Binary file added docs/git_image/erweima.png
20 changes: 20 additions & 0 deletions pycorrector/data/bert_models/finetuned_chinese_lm/bert_config.json
@@ -0,0 +1,20 @@
{
"attention_probs_dropout_prob": 0.1,
"directionality": "bidi",
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"initializer_range": 0.02,
"intermediate_size": 3072,
"layer_norm_eps": 1e-12,
"max_position_embeddings": 512,
"num_attention_heads": 12,
"num_hidden_layers": 12,
"pooler_fc_size": 768,
"pooler_num_attention_heads": 12,
"pooler_num_fc_layers": 3,
"pooler_size_per_head": 128,
"pooler_type": "first_token_transform",
"type_vocab_size": 2,
"vocab_size": 21128
}
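
The added bert_config.json matches the standard bert-base-chinese architecture (12 layers, 12 attention heads, hidden size 768, vocabulary size 21128). A minimal sketch of how such a file could be consumed, assuming Hugging Face's transformers library (how pycorrector itself loads it is not part of this diff):

```python
# Hedged sketch: read the committed config with Hugging Face transformers
# (an assumption; the repo's own loading code is outside this diff).
from transformers import BertConfig, BertForMaskedLM

config = BertConfig.from_json_file(
    'pycorrector/data/bert_models/finetuned_chinese_lm/bert_config.json')
print(config.hidden_size, config.num_hidden_layers)  # 768 12

# Builds a randomly initialized masked-LM with this architecture; the
# fine-tuned weights would be loaded separately and are not in this commit.
model = BertForMaskedLM(config)
```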
