PyTorch Precision Converter is a robust utility for converting the tensor precision of PyTorch model checkpoints and `safetensors` files. With the increasing need for efficient model deployment on various platforms, especially where memory or computational efficiency is paramount, converting models to reduced-precision formats like `fp16` or `bf16` can be immensely beneficial. This tool can convert not just traditional PyTorch checkpoints but also models saved in the `safetensors` format.
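At its core, precision conversion is a per-tensor dtype cast. As a minimal illustration in plain PyTorch (independent of this tool's internals; the helper name `cast_state_dict` is ours, for illustration only):

```python
import torch

# Precision names used in this README mapped to PyTorch dtypes.
DTYPES = {"fp32": torch.float32, "fp16": torch.float16, "bf16": torch.bfloat16}

def cast_state_dict(state_dict, precision="fp16"):
    """Cast every floating-point tensor to the target dtype.

    Non-floating tensors (e.g. integer step counters) are left as-is,
    since casting them would corrupt the checkpoint.
    """
    target = DTYPES[precision]
    return {
        name: t.to(target) if t.is_floating_point() else t
        for name, t in state_dict.items()
    }
```

Casting `fp32` weights to `fp16` or `bf16` roughly halves the checkpoint's size on disk and in memory.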
Key features include:
- Conversion of models to different precision formats.
- Option to select between different model configurations, like full model, only the Exponential Moving Average (EMA) parameters, or excluding the EMA parameters.
- Saving capability in both the custom `safetensors` format and the usual PyTorch checkpoint format.
- Multiple Source Formats: Load models from both PyTorch checkpoints and `safetensors` files.
- Precision Conversion: Convert tensors to different precision formats: `fp32`, `fp16`, and `bf16`.
- Model Type Conversion: Choose to convert:
  - The full model
  - Only the EMA parameters
  - Everything except the EMA parameters
- Format Flexibility: Save converted models in either the `safetensors` format or the standard PyTorch checkpoint format (see the sketch after this list).
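The converter's actual source isn't reproduced here, but the features above can be sketched in a few lines of PyTorch. The EMA key prefix `model_ema.` and the helper names below are assumptions for illustration; real checkpoints name their EMA weights differently depending on the training framework:

```python
import torch
from safetensors.torch import load_file, save_file

EMA_PREFIX = "model_ema."  # assumed prefix; varies by training framework

def load_tensors(path):
    """Load a flat name -> tensor mapping from either supported format."""
    if path.endswith(".safetensors"):
        return load_file(path)
    ckpt = torch.load(path, map_location="cpu")
    # Many PyTorch checkpoints nest the weights under a "state_dict" key.
    return ckpt.get("state_dict", ckpt)

def select_tensors(tensors, mode="full"):
    """Apply the model-type choice: full, ema-only, or no-ema."""
    if mode == "ema-only":
        return {k: v for k, v in tensors.items() if k.startswith(EMA_PREFIX)}
    if mode == "no-ema":
        return {k: v for k, v in tensors.items() if not k.startswith(EMA_PREFIX)}
    return tensors

def save_tensors(tensors, path, safe_tensors=False):
    """Write safetensors or a standard PyTorch checkpoint."""
    if safe_tensors:
        # safetensors requires contiguous tensors.
        save_file({k: v.contiguous() for k, v in tensors.items()}, path)
    else:
        torch.save(tensors, path)
```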
Requirements:

- Python 3.7 or higher
- PyTorch 1.7.0 or higher
- `safetensors` library
Installation:

- Clone the GitHub repository:

  ```bash
  git clone https://github.com/angelolamonaca/PyTorch-Precision-Converter.git
  cd PyTorch-Precision-Converter
  ```

- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```
Use the converter through the command line:

```bash
python converter.py -f <path_to_model> -p <precision> -t <conversion_type> -st
```

Options:

- `-f` or `--file`: Path to the model (PyTorch checkpoint or `safetensors` file). Default is `model.ckpt`.
- `-p` or `--precision`: Desired tensor precision (`fp32`, `fp16`, or `bf16`). Default is `fp32`.
- `-t` or `--type`: Conversion type (`full`, `ema-only`, `no-ema`). Default is `full`.
- `-st` or `--safe-tensors`: Flag to save the model in `safetensors` format. By default, the model is saved in PyTorch checkpoint format.
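For orientation, the flag set above maps naturally onto `argparse`. The following is a hypothetical sketch of such a CLI, not the converter's actual source:

```python
import argparse

def parse_args():
    parser = argparse.ArgumentParser(
        description="Convert the tensor precision of a model checkpoint."
    )
    parser.add_argument("-f", "--file", default="model.ckpt",
                        help="Path to a PyTorch checkpoint or safetensors file.")
    parser.add_argument("-p", "--precision", default="fp32",
                        choices=["fp32", "fp16", "bf16"],
                        help="Target tensor precision.")
    parser.add_argument("-t", "--type", default="full",
                        choices=["full", "ema-only", "no-ema"],
                        help="Which parameters to keep in the output.")
    parser.add_argument("-st", "--safe-tensors", action="store_true",
                        help="Save the result in safetensors format.")
    return parser.parse_args()
```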
Examples:

Convert a model saved in `safetensors` format to half precision (`fp16`):

```bash
python converter.py -f my_model.safetensors -p fp16
```

Convert only the EMA parameters of a PyTorch checkpoint to `bf16` and save in `safetensors` format:

```bash
python converter.py -f my_model.ckpt -p bf16 -t ema-only -st
```

We welcome contributions to the PyTorch Precision Converter. If you wish to contribute, kindly follow the standard GitHub pull request process:
- Fork the repository.
- Clone, create a new branch, make changes, and push them to your fork.
- Open a pull request.
Ensure that your code adheres to the style and conventions of the existing codebase.
This project is licensed under the MIT License - see the LICENSE file for details.
Special thanks to the PyTorch team and the OpenAI community for their continuous contributions to the machine learning ecosystem.