update llama2_70b_lora evaluation skipping per WG decision
itayhubara committed Apr 14, 2024
1 parent 03320d0 commit 9f32f97
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion training_rules.adoc
@@ -462,7 +462,7 @@ CLOSED: The same quality measure as the reference implementation must be used. T
|Language|Speech recognition |RNN-T|Every 1 epoch
| |NLP |BERT| eval_interval_samples=FLOOR(0.05*(230.23*GBS+3000000), 25000), skipping 0
| |large Language Model |GPT3| Every 24576 sequences. CEIL(24576 / global_batch_size) if 24576 is not divisible by GBS
-| |large Language Model |Llama2_70B_LoRA| Every 384 sequences, CEIL(384 / global_batch_size) steps if 384 is not divisible by GBS. skipping first 3 evaluations
+| |large Language Model |Llama2_70B_LoRA| Every 384 sequences, CEIL(384 / global_batch_size) steps if 384 is not divisible by GBS. Skipping first FLOOR(0.125*global_batch_size+2) evaluations
|Commerce|Recommendation |DLRMv2 (DCNv2)|Every FLOOR(TOTAL_TRAINING_SAMPLES / (GLOBAL_BATCH_SIZE * NUM_EVAL)) samples, where TOTAL_TRAINING_SAMPLES = 4195197692 and NUM_EVAL = 20
|Graphs|Node classification|R-GAT|Evaluate 20 times per epoch
|===
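
Below is a hypothetical Python sketch (not part of the MLCommons reference code) showing how the interval and skipping formulas in the table above evaluate for a given global batch size. The function names and the example batch size are illustrative assumptions; the two-argument FLOOR in the BERT row is read as spreadsheet-style rounding down to a multiple of the second argument.

```python
import math

def bert_eval_interval_samples(gbs: int) -> int:
    # eval_interval_samples = FLOOR(0.05*(230.23*GBS + 3000000), 25000);
    # the two-argument FLOOR rounds down to the nearest multiple of 25000.
    raw = 0.05 * (230.23 * gbs + 3_000_000)
    return int(raw // 25_000) * 25_000

def gpt3_eval_interval_steps(gbs: int) -> int:
    # Evaluate every 24576 sequences; CEIL(24576 / global_batch_size)
    # steps when 24576 is not divisible by the global batch size.
    return math.ceil(24_576 / gbs)

def llama2_lora_skipped_evals(gbs: int) -> int:
    # The rule changed by this commit: skip the first
    # FLOOR(0.125*global_batch_size + 2) evaluations.
    return math.floor(0.125 * gbs + 2)

if __name__ == "__main__":
    gbs = 8  # example global batch size, illustrative only
    print(bert_eval_interval_samples(gbs))  # 150000
    print(gpt3_eval_interval_steps(gbs))    # 3072
    print(llama2_lora_skipped_evals(gbs))   # 3
```

Note how the new Llama2_70B_LoRA rule scales the number of skipped evaluations with the global batch size instead of the previous fixed count of 3.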
