move previous fused_adam and fp16_optimizer to contrib #517

Merged
4 commits merged on Oct 4, 2019

Conversation

FDecaYed (Contributor) commented Oct 1, 2019

Add back the old fused_adam to contrib, which handles type casting, unscaling, and clipping.
Move the fp16_optimizer wrapper along with it, which handles flattening, NaN checking, and loss scaling.

Users should be advised to use AMP with the optimizers in apex.optimizers instead.
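For reference, a minimal sketch of the recommended path: letting AMP manage mixed precision around `apex.optimizers.FusedAdam`, which takes over the casting, unscaling, and loss scaling the old FP16_Optimizer wrapper used to handle. The model, input, learning rate, and `opt_level` below are placeholders, not values prescribed by this PR.

```python
import torch
from apex import amp
from apex.optimizers import FusedAdam

# Placeholder model and optimizer; swap in your own.
model = torch.nn.Linear(1024, 1024).cuda()
optimizer = FusedAdam(model.parameters(), lr=1e-3)

# amp.initialize patches the model and optimizer for mixed precision.
model, optimizer = amp.initialize(model, optimizer, opt_level="O2")

inp = torch.randn(8, 1024, device="cuda")
loss = model(inp).float().pow(2).mean()

# AMP handles loss scaling; no manual FP16_Optimizer wrapper needed.
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()
optimizer.step()
optimizer.zero_grad()
```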

vince62s commented Oct 3, 2019

Is this somehow a workaround for #475?

FDecaYed (Contributor, Author) commented Oct 3, 2019

> Is this somehow a workaround for #475?

We are adding this code path back to help investigate the issue; we don't recommend switching back to this deprecated path.
