Fix issue#85 #86
Conversation
Signed-off-by: Anhforth <[email protected]>
* clean codes
Co-authored-by: Zac Liu <[email protected]>
* fix bert tokenizer issue
* updated t5, opt and roberta tokenizers
* fixed doc 404 error
Signed-off-by: ZhaodongYan1 <[email protected]>
* autoloader for opt
* opt-66b inference
* Update train.py
* Load data from example dir
* add readme of multi GPU inference
Co-authored-by: Zac Liu <[email protected]>
Fix a bug in multi-GPU training
Fix bugs in the BERT forward pass when the attention mask has a different shape
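The commit above concerns attention masks of varying shapes. As a minimal sketch (a hypothetical helper, not the repository's actual code), BERT-style self-attention typically normalizes a 2D padding mask or a 3D per-position mask into a 4D additive mask:

```python
import torch

def expand_attention_mask(attention_mask: torch.Tensor) -> torch.Tensor:
    """Normalize a 2D [batch, seq] or 3D [batch, seq, seq] mask to the
    4D broadcastable shape used inside BERT-style self-attention.
    Hypothetical helper for illustration only."""
    if attention_mask.dim() == 2:
        # [batch, seq] -> [batch, 1, 1, seq]: broadcast over heads and query positions
        extended = attention_mask[:, None, None, :]
    elif attention_mask.dim() == 3:
        # [batch, seq, seq] -> [batch, 1, seq, seq]: broadcast over heads only
        extended = attention_mask[:, None, :, :]
    else:
        extended = attention_mask
    # Convert a 1/0 keep-mask into an additive mask: 0 where attended,
    # a large negative value where masked, so softmax zeroes it out.
    return (1.0 - extended.float()) * -10000.0
```

Handling both ranks at one choke point avoids shape mismatches when callers pass masks of different dimensionality.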
@@ -168,12 +168,12 @@ def predict_ner(self,
    model.eval()
    device = next(model.parameters()).device
    tokenizer = self.tokenizer
    tokens = tokenizer.tokenize(text)
check
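The review comment above flags the `tokenize` call in the diff. A minimal sketch of that preamble as a standalone function (hypothetical names and truncation policy; the real method lives on a class and does more):

```python
import torch

def predict_ner(model, tokenizer, text, maxlen=256):
    """Sketch of the predict_ner preamble from the diff: switch to eval
    mode, locate the model's device, and tokenize the input text."""
    model.eval()
    device = next(model.parameters()).device
    tokens = tokenizer.tokenize(text)
    # Assumed policy: truncate to leave room for [CLS]/[SEP] special tokens
    tokens = tokens[: maxlen - 2]
    ids = tokenizer.convert_tokens_to_ids(tokens)
    input_ids = torch.tensor([ids], device=device)
    with torch.no_grad():
        logits = model(input_ids)
    return logits
```

Deriving the device from `next(model.parameters()).device` keeps the input tensor on the same device as the model without threading a device argument through every call.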
    return token_mapping

    @staticmethod
    def _is_control(ch):
        """Determine whether a character is a control character
comments
@@ -47,23 +47,67 @@ def from_pretrain(cls,
    device="cpu",
    **kwargs):
    model_id = None
- add comments
- add parameters for load_local
- add assert for config_path & checkpoint_path
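The review notes above ask for asserts on `config_path` and `checkpoint_path`. A sketch of what that validation might look like (hypothetical standalone function and default-path convention; the real code is a classmethod on the loader):

```python
import os
import torch

def load_local(checkpoint_path, config_path=None, device="cpu", **kwargs):
    """Sketch: validate both paths before loading, per the review notes."""
    if config_path is None:
        # Assumed convention: config.json sits next to the checkpoint file
        config_path = os.path.join(os.path.dirname(checkpoint_path), "config.json")
    # Fail fast with a clear message instead of a cryptic error deep in loading
    assert os.path.exists(config_path), f"config file not found: {config_path}"
    assert os.path.exists(checkpoint_path), f"checkpoint file not found: {checkpoint_path}"
    state_dict = torch.load(checkpoint_path, map_location=device)
    return state_dict
```

Asserting early turns a missing file into an immediate, self-explanatory failure rather than a downstream deserialization error.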
No description provided.