Add support for Flex Attention #1675

Draft: wants to merge 91 commits into mosaicml:main from shashank/flexattention

Changes from 1 commit

Commits (91)
f416539
adding flex attention
ShashankMosaicML Nov 18, 2024
ac3a884
registrifying score mods
ShashankMosaicML Nov 18, 2024
31b27e2
registrifying attention mask mods
ShashankMosaicML Nov 18, 2024
c8fffa5
Merge branch 'mosaicml:main' into shashank/flexattention
ShashankMosaicML Nov 18, 2024
86dce3b
bug_fix
ShashankMosaicML Nov 19, 2024
cb8f4a6
bug_fix
ShashankMosaicML Nov 19, 2024
902850a
lint
ShashankMosaicML Nov 19, 2024
9c9708d
configuring test
ShashankMosaicML Nov 19, 2024
f1ff430
configuring tests
ShashankMosaicML Nov 19, 2024
e537f5a
bug fix
ShashankMosaicML Nov 19, 2024
c527dd7
fixing alibi
ShashankMosaicML Nov 19, 2024
15e303e
Merge branch 'mosaicml:main' into shashank/flexattention
ShashankMosaicML Nov 19, 2024
c4ef5d9
configuring further tests
ShashankMosaicML Nov 19, 2024
6b37427
refactoring
ShashankMosaicML Nov 19, 2024
e30fe7a
adding warnings and errors
ShashankMosaicML Nov 19, 2024
924a53c
gating tests on torch version
ShashankMosaicML Nov 19, 2024
57048e3
Merge branch 'mosaicml:main' into shashank/flexattention
ShashankMosaicML Nov 19, 2024
67a2aea
reorganizing function defs
ShashankMosaicML Nov 19, 2024
04f3a62
refactoring
ShashankMosaicML Nov 19, 2024
ab6c58c
passing in dicts of mask and score mods
ShashankMosaicML Nov 19, 2024
3b3827d
making mask and score mods configurable via yaml
ShashankMosaicML Nov 19, 2024
be43e8d
Merge branch 'mosaicml:main' into shashank/flexattention
ShashankMosaicML Nov 19, 2024
2264f91
adding torch.compile
ShashankMosaicML Nov 20, 2024
e274d9f
..
ShashankMosaicML Nov 20, 2024
a26bb4f
..
ShashankMosaicML Nov 20, 2024
d5ab7d3
undoing comment out
ShashankMosaicML Nov 20, 2024
d40e978
Merge branch 'mosaicml:main' into shashank/flexattention
ShashankMosaicML Nov 26, 2024
5f13e7b
adding torch compile
ShashankMosaicML Nov 26, 2024
ca8e173
temporary commit commenting out block mask and score mod
ShashankMosaicML Nov 26, 2024
f5486ff
undoing prev temp commit
ShashankMosaicML Nov 26, 2024
fdced3a
Merge branch 'mosaicml:main' into shashank/flexattention
ShashankMosaicML Nov 26, 2024
c53db63
speeding up block mask generation
ShashankMosaicML Nov 27, 2024
ec5900d
precompiling create block mask
ShashankMosaicML Nov 27, 2024
02ad3b6
minor
ShashankMosaicML Nov 27, 2024
13a5fc8
compiling mask and flex attn once for the entire model
ShashankMosaicML Nov 27, 2024
2ae6027
..
ShashankMosaicML Nov 27, 2024
0c5150a
..
ShashankMosaicML Nov 27, 2024
ff28304
making sequence id transforms configurable
ShashankMosaicML Nov 27, 2024
23ba20f
..
ShashankMosaicML Nov 27, 2024
72c45ae
..
ShashankMosaicML Nov 27, 2024
73066a4
..
ShashankMosaicML Nov 27, 2024
9f616f7
..
ShashankMosaicML Dec 1, 2024
94ecade
converting mods from dict to list
ShashankMosaicML Dec 1, 2024
4b30130
switching off seq id masking if configured so
ShashankMosaicML Dec 2, 2024
9daf068
fix bug
ShashankMosaicML Dec 2, 2024
67aa900
fix bug
ShashankMosaicML Dec 2, 2024
65a0425
adding global and local window mask
ShashankMosaicML Dec 2, 2024
3443b69
..
ShashankMosaicML Dec 2, 2024
f6b3705
fixing test
ShashankMosaicML Dec 2, 2024
d5ff138
..
ShashankMosaicML Dec 2, 2024
43cb0d1
flex attn softcap only int
ShashankMosaicML Dec 2, 2024
f623a1f
..
ShashankMosaicML Dec 2, 2024
0fea56a
..
ShashankMosaicML Dec 2, 2024
eb6e792
..
ShashankMosaicML Dec 2, 2024
5852da0
..
ShashankMosaicML Dec 2, 2024
615a904
Merge branch 'mosaicml:main' into shashank/flexattention
ShashankMosaicML Dec 3, 2024
0474027
simplifying design
ShashankMosaicML Dec 4, 2024
70aa0c7
removing check_seq_id_attn_mask
ShashankMosaicML Dec 4, 2024
fc8a120
..
ShashankMosaicML Dec 4, 2024
5f88093
..
ShashankMosaicML Dec 4, 2024
661f7f6
fixing tests
ShashankMosaicML Dec 4, 2024
fef3a5d
..
ShashankMosaicML Dec 4, 2024
f6c66e8
..
ShashankMosaicML Dec 4, 2024
eacca42
Merge branch 'main' into shashank/flexattention
ShashankMosaicML Dec 4, 2024
4385f18
allowing block overrides for flex attention
ShashankMosaicML Dec 4, 2024
e17d1ff
..
ShashankMosaicML Dec 4, 2024
58760fc
configuring tests, fixing bugs
ShashankMosaicML Dec 4, 2024
f4ad493
fixing bug when using past kv caches
ShashankMosaicML Dec 4, 2024
67f9aae
bug fix
ShashankMosaicML Dec 4, 2024
5fcbc18
..
ShashankMosaicML Dec 4, 2024
8dfdedb
..
ShashankMosaicML Dec 4, 2024
8912cb2
fixing score mod bug
ShashankMosaicML Dec 5, 2024
18c4bb9
..
ShashankMosaicML Dec 5, 2024
bf1cb6c
..
ShashankMosaicML Dec 5, 2024
5093efd
..
ShashankMosaicML Dec 5, 2024
96b8f82
..
ShashankMosaicML Dec 5, 2024
f1ad991
..
ShashankMosaicML Dec 5, 2024
18afcc5
..
ShashankMosaicML Dec 5, 2024
434aa83
configuring with torch 2.5.1 and 2.6.0.dev
ShashankMosaicML Dec 5, 2024
216fcb9
configuring more tests with torch 2.5.1 and 2.6.0.dev
ShashankMosaicML Dec 5, 2024
438e0f3
..
ShashankMosaicML Dec 5, 2024
2bb25ee
..
ShashankMosaicML Dec 6, 2024
9831b5e
..
ShashankMosaicML Dec 6, 2024
ad601e4
..
ShashankMosaicML Dec 6, 2024
77115c5
..
ShashankMosaicML Dec 6, 2024
dfde51b
figuring out d_model and seq lengths for which flex attention works
ShashankMosaicML Dec 8, 2024
d1d04ce
adding todos
ShashankMosaicML Dec 8, 2024
5eca05f
Merge branch 'main' into shashank/flexattention
ShashankMosaicML Dec 8, 2024
718d89d
adding test for local global attention
ShashankMosaicML Dec 9, 2024
135abd7
Merge branch 'main' into shashank/flexattention
ShashankMosaicML Dec 17, 2024
369e818
Merge branch 'main' into shashank/flexattention
ShashankMosaicML Jan 7, 2025
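
Several commits above lean on FlexAttention's two extension points: score mods, which rewrite individual attention scores, and mask mods, which declare which (query, key) positions may attend at all and are lowered into a block mask. Below is a hedged sketch of each, using the callback signatures from torch.nn.attention.flex_attention (torch >= 2.5); the ALiBi-style slope schedule and the window size are illustrative assumptions, not the exact mods this PR registers.

    import torch

    # A score mod rewrites one attention score given its (batch, head, q, kv)
    # coordinates. This one applies an ALiBi-style linear distance penalty;
    # the per-head slope schedule below is hypothetical.
    def alibi_score_mod(score, b, h, q_idx, kv_idx):
        slope = 2.0 ** (-(h + 1).to(torch.float32))
        return score + slope * (kv_idx - q_idx)

    # A mask mod returns True where attention is allowed. This one combines a
    # causal constraint with a local sliding window of hypothetical size 256,
    # in the spirit of the "global and local window mask" commit above.
    WINDOW = 256

    def sliding_window_mask_mod(b, h, q_idx, kv_idx):
        causal = q_idx >= kv_idx
        local = (q_idx - kv_idx) <= WINDOW
        return causal & local

Such callables are passed to flex_attention via score_mod= and, after create_block_mask, via block_mask=; registering them behind names ("registrifying", in the commit messages) is what lets later commits expose them through YAML config.
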
Commit 96b8f82e4b5bfaf2e1a94360012c076ebaddb17b ("..")
ShashankMosaicML committed Dec 5, 2024
5 changes: 3 additions & 2 deletions llmfoundry/models/mpt/modeling_mpt.py
@@ -424,12 +424,13 @@ def __init__(self, config: MPTConfig):
         self.shift_labels = True

         if self.attn_impl == 'flex':
+            flex_attn_compile = config.attn_config.pop('flex_attn_compile')
             self.compiled_flex_attention = torch.compile(
                 flex_attention,
-            ) if config.attn_config['flex_attn_compile'] else flex_attention
+            ) if flex_attn_compile else flex_attention
             self.compiled_create_block_mask = torch.compile(
                 create_block_mask,
-            ) if config.attn_config['flex_attn_compile'] else create_block_mask
+            ) if flex_attn_compile else create_block_mask

         self.blocks = self.construct_blocks(config=config,)

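
For readers skimming the diff: the change hoists the flex_attn_compile flag out of attn_config (via pop, so the key does not linger in the config) and uses it to decide whether flex_attention and create_block_mask are wrapped in torch.compile once for the whole model. A minimal, self-contained sketch of that pattern follows, assuming the torch.nn.attention.flex_attention API from torch >= 2.5; the attn_config dict and the causal mask are illustrative stand-ins, not llm-foundry's actual code.

    import torch
    from torch.nn.attention.flex_attention import create_block_mask, flex_attention

    # Hypothetical stand-in for config.attn_config; only the key mirrors the diff.
    attn_config = {'flex_attn_compile': torch.cuda.is_available()}

    # Pop the flag so no stray key is left behind in attn_config (as in the diff).
    flex_attn_compile = attn_config.pop('flex_attn_compile')

    # Wrap both entry points in torch.compile exactly once, model-wide.
    compiled_flex_attention = torch.compile(
        flex_attention,
    ) if flex_attn_compile else flex_attention
    compiled_create_block_mask = torch.compile(
        create_block_mask,
    ) if flex_attn_compile else create_block_mask

    # Illustrative use: a causal block mask feeding the (possibly compiled) kernel.
    def causal_mask_mod(b, h, q_idx, kv_idx):
        return q_idx >= kv_idx

    device = 'cuda' if torch.cuda.is_available() else 'cpu'
    B, H, S, D = 2, 4, 128, 64
    block_mask = compiled_create_block_mask(
        causal_mask_mod, B=B, H=H, Q_LEN=S, KV_LEN=S, device=device,
    )
    q, k, v = (torch.randn(B, H, S, D, device=device) for _ in range(3))
    out = compiled_flex_attention(q, k, v, block_mask=block_mask)

Compiling once at the model level, rather than per block, matches the earlier commit "compiling mask and flex attn once for the entire model" and avoids recompiling the same kernels for every layer.
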