GlassBlur
Description
Apply a glass blur effect to the input image. This transform simulates looking through textured glass by locally shuffling pixels, creating a distorted, frosted-glass appearance.

Args:
- sigma (float): Standard deviation of the Gaussian kernel used in the process. Higher values increase the blur. Must be non-negative. Default: 0.7
- max_delta (int): Maximum distance in pixels for shuffling; determines how far pixels can be moved. Larger values create more distortion. Must be a positive integer. Default: 4
- iterations (int): Number of times the glass blur effect is applied. More iterations create a stronger effect but increase computation time. Must be a positive integer. Default: 2
- mode (Literal["fast", "exact"]): Mode of computation. "fast" uses a faster but potentially less accurate method; "exact" uses a slower but more precise method. Default: "fast"
- p (float): Probability of applying the transform. Must be in the range [0, 1]. Default: 0.5

Targets: image
Image types: uint8, float32
Number of channels: any

Note:
- This transform is particularly effective for creating a "looking through glass" effect or simulating the view through a frosted window.
- The "fast" mode is recommended for most use cases, as it offers a good balance between effect quality and computation speed.
- Increasing iterations strengthens the effect but also increases processing time linearly.

Example:
>>> import numpy as np
>>> import albumentations as A
>>> image = np.random.randint(0, 256, (100, 100, 3), dtype=np.uint8)
>>> transform = A.GlassBlur(sigma=0.7, max_delta=4, iterations=3, mode="fast", p=1)
>>> result = transform(image=image)
>>> glass_blurred_image = result["image"]

References:
- Hendrycks and Dietterich, "Benchmarking Neural Network Robustness to Common Corruptions and Perturbations": https://arxiv.org/abs/1903.12261
- Original implementation: https://github.com/hendrycks/robustness/blob/master/ImageNet-C/create_c/make_imagenet_c.py
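For intuition about the mechanism described above, the following is a minimal NumPy/SciPy sketch of the idea, not Albumentations' optimized implementation. It assumes an (H, W, C) uint8 image, and the function name is purely illustrative: Gaussian-blur the image, then repeatedly swap each pixel with a randomly chosen neighbour at most max_delta pixels away.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def glass_blur_sketch(image, sigma=0.7, max_delta=4, iterations=2, seed=0):
    """Illustrative glass blur: blur, then locally shuffle pixels."""
    rng = np.random.default_rng(seed)
    # Blur spatial axes only; leave the channel axis untouched.
    out = gaussian_filter(image.astype(np.float32), sigma=(sigma, sigma, 0))
    h, w = out.shape[:2]
    for _ in range(iterations):
        for y in range(max_delta, h - max_delta):
            for x in range(max_delta, w - max_delta):
                # Pick a random offset within max_delta and swap the two pixels.
                dy, dx = rng.integers(-max_delta, max_delta + 1, size=2)
                out[y, x], out[y + dy, x + dx] = (
                    out[y + dy, x + dx].copy(),
                    out[y, x].copy(),
                )
    return np.clip(out, 0, 255).astype(np.uint8)
```

The Python loops make this far slower than the library's vectorized version, but the output illustrates why larger max_delta and more iterations produce a stronger frosted-glass distortion.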
Parameters
- sigma: float (default: 0.7)
- max_delta: int (default: 4)
- iterations: int (default: 2)
- mode: Literal['fast', 'exact'] (default: 'fast')
- p: float (default: 0.5)
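A common way to apply these parameters is inside an augmentation pipeline. The sketch below uses the standard Albumentations/OpenCV APIs; "example.jpg" is a placeholder path.

```python
import cv2
import albumentations as A

# Placeholder path; substitute your own image file.
image = cv2.imread("example.jpg")
image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)

transform = A.Compose([
    # mode="exact" is slower but more precise; "fast" suits most use cases.
    A.GlassBlur(sigma=0.7, max_delta=4, iterations=2, mode="fast", p=0.5),
])

augmented = transform(image=image)["image"]
```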
Targets
- Image