Dataset: Salesforce/wikisql
How to use smangrul/tinyllama_lora_sql with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model first, then attach the LoRA adapter weights on top of it
base_model = AutoModelForCausalLM.from_pretrained("TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T")
model = PeftModel.from_pretrained(base_model, "smangrul/tinyllama_lora_sql")
```

This model is a fine-tuned version of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T on the wikisql dataset. It achieves the following results on the evaluation set:

- Loss: 0.0457
Training results:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 0.0482 | 1.0 | 263 | 0.0457 |
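For inference, the adapter is typically prompted with a wikisql-style question plus the table schema. The exact prompt template used during fine-tuning is not documented in this card, so the `build_prompt` helper below is a hypothetical sketch of how an input might be formatted before calling `model.generate` on the PEFT model loaded above:

```python
# Hypothetical prompt builder -- the actual template used for fine-tuning
# on wikisql is not documented in this model card.
def build_prompt(question: str, columns: list[str]) -> str:
    schema = ", ".join(columns)
    return f"Table columns: {schema}\nQuestion: {question}\nSQL:"

prompt = build_prompt(
    "How many wins did the team have in 2004?",
    ["team", "year", "wins", "losses"],
)
print(prompt)

# Generation with the model from the snippet above would then look like
# (not executed here, since it requires downloading the weights):
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T")
# inputs = tokenizer(prompt, return_tensors="pt")
# outputs = model.generate(**inputs, max_new_tokens=64)
# print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Ending the prompt with `SQL:` nudges the model to continue with a query; whatever separator the training data actually used would work better if known.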