GPT-IMAGE-EDIT-1.5M

A Million-Scale, GPT-Generated Image Dataset

An illustration of our GPT-Image-Edit-1.5M dataset.

Abstract

We systematically construct this dataset by leveraging the versatile capabilities of GPT-4o to unify and refine three popular image-editing datasets: OmniEdit, HQ-Edit, and UltraEdit. Specifically, our methodology involves (1) regenerating output images to enhance visual quality and instruction alignment, and (2) selectively rewriting prompts to improve semantic clarity. To validate the efficacy of our dataset, we fine-tune advanced open-source models on GPT-IMAGE-EDIT-1.5M. The empirical results are exciting: for example, the fine-tuned FluxKontext achieves highly competitive performance across a comprehensive suite of benchmarks, including 7.24@GEdit-EN, 3.80@ImgEdit-Full, and 8.78@Complex-Edit, showing stronger instruction following and higher perceptual quality while maintaining identity. These scores markedly exceed all previously published open-source methods and substantially narrow the gap to leading proprietary models. We hope the full release of GPT-IMAGE-EDIT-1.5M can help catalyze further open research in instruction-guided image editing.
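As a concrete illustration of these two steps, the sketch below pairs a GPT-4o prompt-rewriting call with an image-regeneration call. This is a minimal sketch and not our released pipeline: it assumes the OpenAI Python SDK, uses the gpt-image-1 image-editing endpoint as a stand-in for GPT-4o's image-generation capability, and the helper names and system prompt are hypothetical.

# Illustrative sketch only -- not the released data pipeline.
# Assumes the OpenAI Python SDK; helper names are hypothetical.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

REWRITE_SYSTEM_PROMPT = (
    "Rewrite the given image-editing instruction so it is unambiguous and "
    "semantically clear. Return only the rewritten instruction."
)

def rewrite_instruction(instruction: str) -> str:
    # Step (2): selectively rewrite a prompt for semantic clarity.
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": REWRITE_SYSTEM_PROMPT},
            {"role": "user", "content": instruction},
        ],
    )
    return resp.choices[0].message.content.strip()

def regenerate_output(source_path: str, instruction: str, out_path: str) -> None:
    # Step (1): regenerate the edited image from the source image and instruction.
    with open(source_path, "rb") as src:
        result = client.images.edit(model="gpt-image-1", image=src, prompt=instruction)
    with open(out_path, "wb") as out:
        out.write(base64.b64decode(result.data[0].b64_json))

# Refine one (source image, instruction) pair.
instruction = rewrite_instruction("make sky pretty")
regenerate_output("source.png", instruction, "edited.png")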

Dataset
An overview of our data generation pipeline.
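For readers who want to browse the data, a typical loading pattern with the Hugging Face datasets library is sketched below. The repository id and field names are assumptions; consult the official release page for the exact layout.

# Minimal loading sketch, assuming the dataset is hosted on the Hugging Face Hub.
from datasets import load_dataset

# Repository id is an assumption; check the official release page.
ds = load_dataset("UCSC-VLAA/GPT-Image-Edit-1.5M", split="train", streaming=True)

for sample in ds.take(3):
    # Each sample is expected to pair a source image, a regenerated output image,
    # and the (possibly rewritten) editing instruction.
    print(sample.keys())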
Results
Quantitative Results
Quantitative comparisons on ImgEdit, GEdit-Bench-EN, OmniContext (Single), and Complex-Edit.
Qualitative Results
Qualitative comparisons on ImgEdit, GEdit-Bench-EN, OmniContext, and Complex-Edit.

BibTeX

@misc{wang2025gptimageedit15mmillionscalegptgeneratedimage,
    title={GPT-IMAGE-EDIT-1.5M: A Million-Scale, GPT-Generated Image Dataset}, 
    author={Yuhan Wang and Siwei Yang and Bingchen Zhao and Letian Zhang and Qing Liu and Yuyin Zhou and Cihang Xie},
    year={2025},
    eprint={2507.21033},
    archivePrefix={arXiv},
    primaryClass={cs.CV},
    url={https://arxiv.org/abs/2507.21033}, 
}

Acknowledgments

We would like to thank Ashwin Nagarajan, Tejas Polu, Jiawei Mao, Zeyu Wang, and Haoqin Tu for early discussions and exploration of this project. We would also like to thank the Microsoft Accelerate Foundation Models Research Program for supporting our computing needs.