I am fine-tuning the PaLM 2 chat-bison model with my custom dataset. How can I reduce the resources used during tuning, given that my dataset is very small?
Before fine-tuning, explore prompt engineering with zero-shot and few-shot examples: with a very small dataset these often reach comparable accuracy without consuming any training resources (see the first sketch below). If you do tune, the number of training steps is the main knob you control, and fewer steps mean less accelerator time billed. The tuning hardware itself depends on the region you pick: us-central1 tunes on 8 A100 80GB GPUs, while europe-west4 tunes on a 64-core TPU v3 pod.
I hope this is helpful!
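As a minimal sketch of the few-shot route, assuming the Vertex AI Python SDK (`google-cloud-aiplatform`); the project ID, context string, and example pairs are placeholders you would replace with your own:

```python
import vertexai
from vertexai.language_models import ChatModel, InputOutputTextPair

# Hypothetical project/location values for illustration.
vertexai.init(project="my-project", location="us-central1")

chat_model = ChatModel.from_pretrained("chat-bison@001")

# Few-shot prompting: steer the base model with a handful of
# input/output examples instead of tuning it.
chat = chat_model.start_chat(
    context="You classify customer feedback as positive, negative, or neutral.",
    examples=[
        InputOutputTextPair(
            input_text="The checkout flow was quick and painless.",
            output_text="positive",
        ),
        InputOutputTextPair(
            input_text="My order arrived two weeks late.",
            output_text="negative",
        ),
    ],
)

response = chat.send_message("The app works, but the UI feels dated.")
print(response.text)
```

If a handful of examples like these already give acceptable answers, you avoid the tuning job entirely, which is the cheapest option for a small dataset.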
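If you do go ahead with tuning, a sketch of a low-step job with the same SDK follows. The Cloud Storage path and project ID are placeholders, and the `tune_model` parameter names reflect the SDK as I remember it, so verify them against the current Vertex AI reference before running:

```python
import vertexai
from vertexai.language_models import ChatModel

vertexai.init(project="my-project", location="us-central1")

chat_model = ChatModel.from_pretrained("chat-bison@001")

# With a small dataset, keep train_steps low: each step consumes one
# batch, so a few hundred steps is often enough and directly caps the
# GPU/TPU time you are billed for.
chat_model.tune_model(
    training_data="gs://my-bucket/tuning_data.jsonl",  # hypothetical path
    train_steps=100,
    tuning_job_location="europe-west4",  # 64-core TPU v3 pod region
    tuned_model_location="us-central1",
)
```

Lowering `train_steps` is the only resource lever here; the accelerator type is fixed by `tuning_job_location`.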