Abstract
To address the poor generalization and high data-acquisition cost of current auto white balance (AWB) methods in cross-sensor applications, this work proposes a cross-device AWB strategy that requires no images from the target device. The method fuses dynamic color-temperature calibration with semantic feature modeling. In the calibration stage, white points at multiple typical color temperatures are selected to construct a piecewise diagonal mapping matrix, which reconstructs training images into the target sensor's response domain and thereby aligns the data distributions. In the training stage, a semantic feature extraction module is introduced to improve the robustness of cross-device illuminant estimation. Experiments on multiple standard AWB datasets show leading estimation accuracy, together with strong generalization and stability. The proposed method therefore enables cross-device AWB estimation without any target-device images, substantially reducing data-acquisition cost and improving model deployment efficiency.
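To make the calibration stage concrete, the following minimal Python sketch shows one plausible reading of the piecewise diagonal mapping: per-channel gains are derived from source- and target-sensor white points at a few typical color temperatures, linearly interpolated between those nodes, and applied to remap a source-sensor image into the target sensor's response domain. The color-temperature nodes, the white-point values, and the helper names piecewise_diagonal_map and reconstruct_to_target are illustrative assumptions, not values or interfaces from the paper, whose exact piecewise construction may differ.

import numpy as np

# Illustrative calibration data (assumed, not measured): raw RGB responses to a
# neutral patch for the source and target sensors at a few typical correlated
# color temperatures, normalized so that G = 1.
CCT_NODES = np.array([2850.0, 4000.0, 5500.0, 6500.0])   # color temperature, K
SRC_WHITE = np.array([[0.62, 1.00, 0.35],                 # source-sensor white points
                      [0.55, 1.00, 0.48],
                      [0.47, 1.00, 0.61],
                      [0.44, 1.00, 0.68]])
TGT_WHITE = np.array([[0.58, 1.00, 0.40],                 # target-sensor white points
                      [0.52, 1.00, 0.52],
                      [0.45, 1.00, 0.66],
                      [0.42, 1.00, 0.72]])

def piecewise_diagonal_map(cct):
    """Build a 3x3 diagonal matrix mapping source-sensor RGB to the target
    sensor's response domain at the given color temperature. Between two
    calibration nodes the per-channel gains are linearly interpolated, so the
    overall mapping is piecewise diagonal."""
    gains = TGT_WHITE / SRC_WHITE                          # per-node channel gains, shape (N, 3)
    g = [np.interp(cct, CCT_NODES, gains[:, c]) for c in range(3)]
    return np.diag(g)

def reconstruct_to_target(img_src, cct):
    """Remap a source-sensor raw image of shape (H, W, 3) into the target
    sensor's response domain, simulating training data for the target device."""
    d = piecewise_diagonal_map(cct)
    return np.clip(img_src @ d.T, 0.0, 1.0)

# Usage example: remap a synthetic raw image captured under roughly 5000 K light.
img_source = np.random.rand(8, 8, 3)
img_target = reconstruct_to_target(img_source, cct=5000.0)

Because a diagonal matrix only rescales the three channels independently, interpolating its gains against color temperature keeps the remapping cheap to apply per pixel and easy to invert, which fits the data-distribution alignment role described in the abstract.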
Key words
white balance / color constancy / color calibration / domain adaptation
Funding
Shanghai Talent Program (JS2024055); Shanghai Oriental Talents Program (Teacher Track)