Evo 2 7B 262k Context

This is an intermediate checkpoint of Evo 2 7B, trained with a context length of up to 262,144 tokens (262k).

Evo 2 is a state-of-the-art DNA language model trained autoregressively on trillions of DNA tokens.

For instructions, details, and examples, please refer to the Evo 2 GitHub repository and the accompanying paper.

Model Details

  • Base Model: Evo 2 7B
  • Context Length: 262,144 tokens (262k)
  • Parameters: 7B
  • Checkpoint: Iteration 12,500
  • Architecture: 32 layers

Main Evo 2 Checkpoints

Evo 2 40B and 7B checkpoints, trained with a context length of up to 1 million tokens, are available here:

Checkpoint name   Num layers   Num parameters
evo2_40b          50           40B
evo2_7b           32           7B

We also share 40B, 7B, and 1B base checkpoints trained with an 8,192 token context length:

Checkpoint name   Num layers   Num parameters
evo2_40b_base     50           40B
evo2_7b_base      32           7B
evo2_1b_base      25           1B
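
Each checkpoint's weights can be fetched directly from the Hugging Face Hub. The sketch below uses the standard huggingface_hub client with the repo id for this checkpoint; whether the other checkpoint names above map to repo ids the same way is an assumption to verify on the Hub.

```python
# A minimal sketch: download this checkpoint's files from the Hugging Face Hub.
# The repo id matches this model card; other checkpoints are assumed to live
# under arcinstitute/<checkpoint_name> -- verify on the Hub before relying on it.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="arcinstitute/evo2_7b_262k")
print(f"Checkpoint files downloaded to: {local_dir}")
```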

Usage

Please refer to the Evo 2 GitHub repository for detailed usage instructions and examples.
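
As a quick orientation, the sketch below follows the load-and-generate pattern shown in the repository's README. The model name string 'evo2_7b_262k' and the exact generate arguments are assumptions; check the repository for the supported checkpoint names and current API.

```python
# A minimal sketch of loading and sampling from Evo 2, following the pattern in
# the GitHub README. The model name 'evo2_7b_262k' is an assumption -- check the
# repository for the list of supported checkpoint names and the exact API.
from evo2 import Evo2

# Load this checkpoint (weights are downloaded on first use).
evo2_model = Evo2('evo2_7b_262k')

# Autoregressively generate 100 DNA tokens from a short prompt sequence.
output = evo2_model.generate(
    prompt_seqs=["ACGT"],
    n_tokens=100,
    temperature=1.0,
    top_k=4,
)
print(output.sequences[0])
```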
