
Support Adapt.jl v4 #2374

Merged · merged 3 commits into FluxML:master from patch-1 on Jan 30, 2024
Conversation

@vpuri3 (Contributor) commented Jan 30, 2024:

cc: @maleadt. Hopefully this should just work: I didn't find any reference to Adapt.eltype or Adapt.ndims in the repo with grep.
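The grep check mentioned above can be sketched as follows; the directory layout and file contents here are illustrative stand-ins, not Flux's actual sources:

```shell
# Illustrative layout standing in for the repo being searched.
mkdir -p demo/src
printf 'using Adapt\nAdapt.adapt_structure(to, m) = m\n' > demo/src/layers.jl

# Search for references to the generic methods removed in Adapt.jl v4.
# `-r` recurses into the directory, `-n` prints line numbers,
# and each `-e` adds a pattern to match.
grep -rn -e 'Adapt.eltype' -e 'Adapt.ndims' demo/src || echo "no matches"
```

Since `grep` exits nonzero when nothing matches, the `|| echo` makes the "clean" result explicit rather than silent.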

@vpuri3 (Contributor, Author) commented Jan 30, 2024:

@christiangnrd (Contributor):
That's because AMDGPU.jl doesn't yet support LLVM 16, which is the LLVM version on the current Julia nightly.

@ToucheSir (Member):

This is a dupe of #2362, FYI.

@vpuri3 (Contributor, Author) commented Jan 30, 2024:

Ah, glad it's being worked on. Sorry for the duplicate.

@vpuri3 (Contributor, Author) commented Jan 30, 2024:

@ToucheSir, I just spoke to @christiangnrd. We're abandoning #2362 in favor of this PR. Can you run the workflows here?

@christiangnrd (Contributor) commented Jan 30, 2024:

I did not say that and I don’t have any authority to make those decisions.

What I meant is that the other PR doesn't update the Metal compat. If you removed the Adapt fix from this PR, this one could (not would) be merged for the compat bump alone, which would then let the tests on the original PR pass.

@vpuri3 (Contributor, Author) commented Jan 30, 2024:

Apologies for misphrasing.

The changes in #2362 are already in this PR, along with the Metal compat update. So this PR should (hopefully) pass.

@ToucheSir (Member) left a comment:


I guess opening a new PR worked out because the Metal.jl compat bump was also required, thanks. Could you bump the package version quickly so that we can tag immediately after merging?
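The compat change and version bump discussed here would look something like this in the package's Project.toml; the version numbers below are illustrative placeholders, not the actual values from the PR:

```toml
name = "Flux"
version = "0.14.9"  # illustrative patch bump, so a release can be tagged right after merging

[compat]
Adapt = "3.0, 4"    # illustrative: additionally allow Adapt.jl v4
Metal = "0.5, 1.0"  # illustrative: the Metal.jl compat bump the other PR lacked
```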

@vpuri3 (Contributor, Author) commented Jan 30, 2024:

@ToucheSir done. Thank you.

@ToucheSir ToucheSir merged commit 3eed5fa into FluxML:master Jan 30, 2024
6 of 8 checks passed
@vpuri3 vpuri3 deleted the patch-1 branch January 30, 2024 23:11
3 participants