We develop novel algorithms for uncertainty estimation in computer vision. Our research includes improved trainable calibration methods for neural networks, calibration of medical-imaging classifiers via DCA (Difference between Confidence and Accuracy) regularization, and analysis of the consistency of neural-network decision criteria through input sensitivity. We focus on methods that provide reliable uncertainty estimates, particularly for medical imaging and other safety-critical domains where accurate confidence measures are essential.
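As a rough illustration of the DCA idea, the sketch below computes the penalty as the absolute gap between a batch's mean predicted confidence and its accuracy; in training this term would typically be added to the cross-entropy loss with a weighting coefficient. This is a minimal, framework-free sketch, not our exact implementation, and the function name and data layout are illustrative assumptions.

```python
def dca_penalty(probs, labels):
    """Sketch of a DCA-style penalty: |mean confidence - accuracy|.

    probs:  list of per-sample class-probability lists (each sums to 1)
    labels: list of true class indices, one per sample

    A perfectly calibrated batch (confidence matches accuracy) yields 0;
    overconfident or underconfident batches yield a positive penalty.
    """
    # Predicted class = argmax probability; confidence = that max probability.
    preds = [max(range(len(p)), key=p.__getitem__) for p in probs]
    confidences = [max(p) for p in probs]

    accuracy = sum(int(y_hat == y) for y_hat, y in zip(preds, labels)) / len(labels)
    mean_confidence = sum(confidences) / len(confidences)
    return abs(mean_confidence - accuracy)
```

For example, a batch that is 85% confident on average but only 50% accurate incurs a penalty of 0.35, pushing the network toward confidence estimates that track its actual accuracy.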