Model Selection - General
Model Configuration
The first configuration block of the Axolotl YAML configuration file is the model configuration. It comprises three main settings (a rough mapping to the underlying Hugging Face loading calls is sketched after this list):
base_model - The Hugging Face model repository (or local path) that contains the model weights as *.pt, *.safetensors, or *.bin files
model_type - The model class used to load the weights, for example AutoModelForCausalLM or LlamaForCausalLM
tokenizer_type - The tokenizer class that corresponds to the model, for example AutoTokenizer or LlamaTokenizer
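Conceptually, these three settings tell Axolotl which weights to fetch and which Hugging Face classes to load them with. The sketch below is illustrative only, not Axolotl's internal loading code; it reuses the Phi 2.0 values from the example that follows.

```python
# Illustrative sketch only -- roughly what base_model, model_type and
# tokenizer_type correspond to in plain Hugging Face transformers calls.
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model = "microsoft/phi-2"  # base_model: the repo (or local path) holding the weights

# model_type: AutoModelForCausalLM -> the class used to load the weights
model = AutoModelForCausalLM.from_pretrained(base_model)

# tokenizer_type: AutoTokenizer -> the matching tokenizer class
tokenizer = AutoTokenizer.from_pretrained(base_model)
# (older transformers releases may require trust_remote_code=True for Phi 2.0)
```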
Model Configuration example for Phi 2.0
```yaml
base_model: microsoft/phi-2
model_type: AutoModelForCausalLM
tokenizer_type: AutoTokenizer
```

Model Configuration example for Llama2
```yaml
base_model: NousResearch/Llama-2-7b-hf
model_type: LlamaForCausalLM
tokenizer_type: LlamaTokenizer
is_llama_derived_model: true  # tells Axolotl the model follows the Llama architecture
```

Model Configuration example for TinyLlama
```yaml
base_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
model_type: LlamaForCausalLM
tokenizer_type: LlamaTokenizer
```

Model Configuration example for CodeLlama
```yaml
base_model: codellama/CodeLlama-7b-hf
model_type: LlamaForCausalLM
tokenizer_type: CodeLlamaTokenizer
```

We will now move to setting up the Axolotl training configuration file for Phi 2.0.