add flax whisper implementation #20479

Merged
125 commits merged on Feb 20, 2023

Commits
7d3b6ef
add flax whisper implementation
andyehrenberg Nov 28, 2022
a9bed4c
revert change to setup
andyehrenberg Nov 28, 2022
0312993
remove unused imports
andyehrenberg Nov 28, 2022
c71fe4f
revert generation changes
andyehrenberg Nov 29, 2022
828d800
flax whisper docs
andyehrenberg Nov 29, 2022
baafb1c
docs
andyehrenberg Dec 1, 2022
7dba8b5
Merge branch 'huggingface:main' into flax_whisper
andyehrenberg Dec 1, 2022
2da5a58
import order
andyehrenberg Dec 1, 2022
5ee9c1f
Merge branch 'flax_whisper' of github.com:andyehrenberg/transformers …
andyehrenberg Dec 1, 2022
00f695f
import sorting
andyehrenberg Dec 1, 2022
0ecc03b
isort
andyehrenberg Dec 1, 2022
f66a005
add dummy objects
andyehrenberg Dec 1, 2022
175f344
doc formatting
andyehrenberg Dec 1, 2022
3329e6c
formatting
andyehrenberg Dec 1, 2022
c05089b
remove trailing whitespaces
andyehrenberg Dec 1, 2022
7551181
fix flax whisper docs
andyehrenberg Dec 1, 2022
153f2cb
Merge branch 'huggingface:main' into flax_whisper
andyehrenberg Dec 1, 2022
e255a97
add generation logic to unlock flax whisper
andyehrenberg Dec 2, 2022
f8009d7
Merge branch 'flax_whisper' of github.com:andyehrenberg/transformers …
andyehrenberg Dec 2, 2022
d003074
remove scans
andyehrenberg Dec 2, 2022
ba8a358
give credits to Flax Bart implementation
andyehrenberg Dec 2, 2022
9f4578d
remove unused imports
andyehrenberg Dec 2, 2022
be33fbd
add license
andyehrenberg Dec 2, 2022
8b1338b
remove assert
andyehrenberg Dec 2, 2022
c567f79
more credits to Bart
andyehrenberg Dec 2, 2022
fbe4e25
fix style
andyehrenberg Dec 2, 2022
cde5afd
formatting
andyehrenberg Dec 2, 2022
6aeb8c8
support left padding
andyehrenberg Dec 2, 2022
ec9ca19
add flax whisper generation test
andyehrenberg Dec 5, 2022
8bce923
Merge branch 'huggingface:main' into flax_whisper
andyehrenberg Dec 6, 2022
3f902f6
remove copied from comments whenever not a full copy
andyehrenberg Dec 7, 2022
3fd0a7c
fix docstrings for logits processors
andyehrenberg Dec 7, 2022
abc14a1
revert change to FlaxForceTokensLogitsProcessor
andyehrenberg Dec 7, 2022
d784a23
revert doc changes
andyehrenberg Dec 7, 2022
3dd8282
improve generation docs
andyehrenberg Dec 7, 2022
77fce32
reorganize
andyehrenberg Dec 7, 2022
fefefde
formatting
andyehrenberg Dec 7, 2022
04ad651
cleanup docs
andyehrenberg Dec 7, 2022
14e19c0
add tests
andyehrenberg Dec 7, 2022
cf67b38
handle empty list case
andyehrenberg Dec 7, 2022
3de7509
fix forced decoder ids in flax tests
andyehrenberg Dec 8, 2022
1077588
Merge branch 'huggingface:main' into flax_whisper
andyehrenberg Dec 9, 2022
5e2256a
add flax whisper to inits
andyehrenberg Dec 12, 2022
ada32b8
Merge branch 'flax_whisper' of github.com:andyehrenberg/transformers …
andyehrenberg Dec 12, 2022
669db4e
update dummy objects
andyehrenberg Dec 12, 2022
bea6cf0
docs for FlaxAutoModelForSpeechSeq2Seq
andyehrenberg Dec 12, 2022
e4270b4
fix decoder_position_ids computation in pretrained model decode/__cal…
andyehrenberg Dec 14, 2022
135b634
add Copied from statements as necessary
andyehrenberg Dec 15, 2022
21fe767
compute position_ids only in __call__ and decode methods of pretraine…
andyehrenberg Dec 16, 2022
a901674
improve readability of compute positional embeddings
andyehrenberg Dec 16, 2022
f8d4686
check dimensionality of input_features instead of hidden_states
andyehrenberg Dec 16, 2022
b407611
copied from statement for init_cache
andyehrenberg Dec 16, 2022
8e78c86
formatting
andyehrenberg Dec 16, 2022
810358c
fix copies
andyehrenberg Dec 16, 2022
b06a6ba
fix copies
andyehrenberg Dec 16, 2022
45efd60
pass attention mask to encoder layers
andyehrenberg Dec 21, 2022
718f53b
fix decoder module outputs
andyehrenberg Dec 21, 2022
07a24a8
set dtype
andyehrenberg Dec 22, 2022
43c4ed8
smaller flax model for whisper test
andyehrenberg Dec 22, 2022
ecaac58
Merge branch 'flax_whisper' of github.com:andyehrenberg/transformers …
andyehrenberg Dec 22, 2022
7b35907
Update src/transformers/generation/flax_utils.py
andyehrenberg Dec 31, 2022
8a4d990
Update src/transformers/models/whisper/modeling_flax_whisper.py
andyehrenberg Dec 31, 2022
17c22fe
Update tests/models/whisper/test_modeling_flax_whisper.py
andyehrenberg Dec 31, 2022
8c021ae
cleanup
andyehrenberg Dec 31, 2022
2aed9af
Update src/transformers/models/whisper/modeling_flax_whisper.py
andyehrenberg Dec 31, 2022
64da8fa
bias cleanup
andyehrenberg Dec 31, 2022
6fc7404
Merge branch 'flax_whisper' of github.com:andyehrenberg/transformers …
andyehrenberg Dec 31, 2022
618f85b
doc fix
andyehrenberg Dec 31, 2022
8b56bf4
align style for force tokens processor
andyehrenberg Jan 2, 2023
209834d
readability
andyehrenberg Jan 3, 2023
fac30a0
fix input shape in tests
andyehrenberg Jan 3, 2023
aa87c98
revert FlaxGenerationMixin docstring
andyehrenberg Jan 3, 2023
23af05b
formatting
andyehrenberg Jan 3, 2023
b8086b6
fix tests
andyehrenberg Jan 3, 2023
acef3e0
fix imports
andyehrenberg Jan 3, 2023
da1df33
consistent encoder hidden states
andyehrenberg Jan 3, 2023
4cdba95
consistent hidden states
andyehrenberg Jan 3, 2023
dd7473b
input shapes
andyehrenberg Jan 3, 2023
c5621f7
typo
andyehrenberg Jan 3, 2023
46aec12
partial class trick
andyehrenberg Jan 3, 2023
a003616
partial class for input shape
andyehrenberg Jan 3, 2023
a9604a5
base_class with correct input shape
andyehrenberg Jan 3, 2023
5120afe
partial base classes
andyehrenberg Jan 3, 2023
c6b1ae4
match by name
andyehrenberg Jan 3, 2023
4c239fc
set main_input_name
andyehrenberg Jan 4, 2023
279ceb6
compare on names
andyehrenberg Jan 4, 2023
b81630e
Merge branch 'main' into flax_whisper
andyehrenberg Jan 9, 2023
797fab1
formatting
andyehrenberg Jan 9, 2023
f3173d8
remove unused import
andyehrenberg Jan 9, 2023
b4696ca
safer position ids computation
andyehrenberg Jan 10, 2023
1c11ca6
safer position id computation
andyehrenberg Jan 10, 2023
c128fd8
Update src/transformers/models/whisper/modeling_flax_whisper.py
andyehrenberg Jan 18, 2023
2ae5b08
Update src/transformers/models/whisper/modeling_flax_whisper.py
andyehrenberg Jan 18, 2023
48583bd
remove identical inherited tests
andyehrenberg Jan 18, 2023
c93232f
Merge branch 'flax_whisper' of github.com:andyehrenberg/transformers …
andyehrenberg Jan 18, 2023
1c18f61
fix prompt ids in tests
andyehrenberg Jan 18, 2023
c3b1d34
use generation config
andyehrenberg Jan 18, 2023
bf15d5f
use jnp array
andyehrenberg Jan 18, 2023
c5fc14b
better var names
andyehrenberg Jan 18, 2023
161cb8a
more explicit bias use
andyehrenberg Jan 18, 2023
d9cedb9
Merge branch 'main' into flax_whisper
andyehrenberg Jan 18, 2023
bb9d0af
import transformers
andyehrenberg Jan 18, 2023
f1d90d2
formatting
andyehrenberg Jan 18, 2023
733ae2b
test formatting
andyehrenberg Jan 18, 2023
6295691
remove unused imports
andyehrenberg Jan 18, 2023
902555e
remove unused imports
andyehrenberg Jan 18, 2023
cba4942
formatting
andyehrenberg Jan 18, 2023
0173945
isort
andyehrenberg Jan 18, 2023
48640e5
docs
andyehrenberg Jan 18, 2023
1daee2b
fix ln orders for encoder hidden states
andyehrenberg Jan 26, 2023
fdb0a61
Merge branch 'main' into flax_whisper
andyehrenberg Feb 3, 2023
632c4be
whisper unique generation stuff
andyehrenberg Feb 3, 2023
95403d6
Merge branch 'flax_whisper' of github.com:andyehrenberg/transformers …
andyehrenberg Feb 3, 2023
c5c3ac1
flake
andyehrenberg Feb 3, 2023
907905f
use finfo for attention bias
andyehrenberg Feb 3, 2023
9dbcda8
docs
andyehrenberg Feb 3, 2023
d36cd2c
Update src/transformers/generation/flax_utils.py
andyehrenberg Feb 14, 2023
ab01cfc
docs
andyehrenberg Feb 14, 2023
62d172a
add timestamp flax test
andyehrenberg Feb 14, 2023
455b8bf
jit for timestamps
andyehrenberg Feb 14, 2023
89658d0
formatting
andyehrenberg Feb 14, 2023
a75fd03
clean up timestamps processor
andyehrenberg Feb 15, 2023
758d56c
formatting
andyehrenberg Feb 15, 2023
f9ac652
remove if_true
andyehrenberg Feb 17, 2023
94a526e
cleanup
andyehrenberg Feb 17, 2023
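
Taken together, these commits add `FlaxWhisperForConditionalGeneration` along with the Flax generation pieces it needs (forced/suppressed token processors, timestamp handling). Below is a minimal usage sketch, not taken from the PR itself: it assumes the `openai/whisper-tiny` checkpoint and substitutes a silent dummy waveform for real 16 kHz audio.

```python
import numpy as np
from transformers import WhisperProcessor, FlaxWhisperForConditionalGeneration

processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
# from_pt=True converts the PyTorch weights; drop it if Flax weights are published for the checkpoint.
model = FlaxWhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny", from_pt=True)

audio = np.zeros(16000, dtype=np.float32)  # one second of silence as a stand-in for real audio
input_features = processor(audio, sampling_rate=16000, return_tensors="np").input_features

# Greedy decoding; the returned object exposes the generated token ids as `.sequences`.
output_ids = model.generate(input_features).sequences
print(processor.batch_decode(output_ids, skip_special_tokens=True))
```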
4 changes: 2 additions & 2 deletions src/transformers/models/whisper/modeling_flax_whisper.py
@@ -99,7 +99,7 @@
     be used by default. If you want to change padding behavior, you should modify to your needs. See diagram 1
     in [the paper](https://arxiv.org/abs/1910.13461) for more information on the default strategy.
 position_ids (`numpy.ndarray` of shape `(batch_size, sequence_length)`, *optional*):
-    Whisper does not use position_ids in the encoder as input_features is always the same size and doesn't use
+    Whisper does not use `position_ids` in the encoder as `input_features` is always the same size and doesn't use
     masking, but this argument is preserved for compatibility. By default the silence in the input log mel
     spectrogram are ignored.
 decoder_position_ids (`numpy.ndarray` of shape `(batch_size, sequence_length)`, *optional*):
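
While the encoder gets away without positions, the decoder does need them, and several commits above ("safer position id computation", "compute position_ids only in `__call__` and decode methods") deal with deriving `decoder_position_ids` when the caller does not pass them. A hedged sketch of the usual derivation, not the PR's exact code:

```python
import jax.numpy as jnp

def default_decoder_position_ids(decoder_input_ids, past_length: int = 0):
    """Count token positions from the sequence start, offset by any cached tokens."""
    batch_size, seq_len = decoder_input_ids.shape
    positions = jnp.arange(past_length, past_length + seq_len)
    return jnp.broadcast_to(positions, (batch_size, seq_len))

# Example: two new tokens decoded after 5 cached ones -> positions [[5, 6]]
print(default_decoder_position_ids(jnp.zeros((1, 2), dtype="i4"), past_length=5))
```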
@@ -818,7 +818,7 @@ def __init__(

     def init_weights(self, rng: jax.random.PRNGKey, input_shape: Tuple, params: FrozenDict = None) -> FrozenDict:
         # init input tensors
-        input_features = jnp.zeros(input_shape)
+        input_features = jnp.zeros(input_shape, dtype="f4")
         input_features = input_features.at[(..., -1)].set(self.config.eos_token_id)

         decoder_input_ids = jnp.zeros((input_shape[0], 1), dtype="i4")
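
The second hunk only pins the dummy encoder features to float32: `jnp.zeros` defaults to float32 anyway unless `jax_enable_x64` is active, so the explicit `"f4"` keeps the dummy input dtype the same either way. A standalone sketch of the dummy-input pattern `init_weights` relies on, with a hypothetical helper name and Whisper's usual 80 x 3000 log-mel shape assumed:

```python
import jax.numpy as jnp

def dummy_whisper_init_inputs(input_shape=(1, 80, 3000)):
    # Encoder features: float32 ("f4") log-mel frames, independent of jax_enable_x64.
    input_features = jnp.zeros(input_shape, dtype="f4")
    # Decoder ids and mask: int32 ("i4") arrays holding a single start position.
    decoder_input_ids = jnp.zeros((input_shape[0], 1), dtype="i4")
    decoder_attention_mask = jnp.ones_like(decoder_input_ids)
    return input_features, decoder_input_ids, decoder_attention_mask

features, ids, mask = dummy_whisper_init_inputs()
print(features.dtype, ids.dtype)  # float32 int32
```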