First, all the imports we'll use later.
import io
import math
from typing import Tuple, List
import svgwrite
from PIL import Image
from deltae2000 import delta_e_cie2000
from neataco import AntColonyOptimization
from neatnearestneighbour import NearestNeighbour
from IPython import display
This image was taken with Lomochrome Purple, a film by Lomography that alters the colour spectrum to purple hues.
filepath = "picnic.jpg"
display.Image(filename=filepath, width=400, height=400)
I use the PIL library to open the image and then compress it with low JPEG quality. This crushes the colours, making similar colours near each other come out as a middle ground between them. Notice the banding: the colours only shift once they become too distant from each other.
compressed_image = io.BytesIO()
with open(filepath, "rb") as image_file:
    image = Image.open(image_file)
    image = image.convert("RGB")
    image.save(compressed_image, format="jpeg", quality=5)
display.Image(data=compressed_image.getvalue(), height=400, width=400)
Take a sampling of colours from the image. I want about 200 colours in my gradient, so I sort the colours by how many pixels they cover and take 200 of them, skipping past the most common half of the list.
percentile = 0.5
count = 200
RGB = Tuple[int, int, int]
# Get the number of pixels of each colour in the image
colours: List[Tuple[int, RGB]] = image.getcolors(maxcolors=1_658_137)
# Sort them from most to least number of pixels
colours = sorted(colours, key=lambda el: el[0], reverse=True)
total = len(colours)
rgb_sample: List[RGB] = []
for i, (_, colour) in enumerate(colours):
    p = i / total
    if p < percentile or i < count:
        continue
    rgb_sample.append(colour)
rgb_sample = rgb_sample[0:count]
The draw_colours function will make an SVG that can be displayed in the Jupyter notebook or saved to a file. This first image is the sample taken from the photo.
def draw_colours(colours):
    draw = svgwrite.Drawing(size=("400px", "100px"))
    draw.add(draw.rect((0, 0), ("100%", "100%"), fill="white"))
    for i, colour in enumerate(colours):
        draw.add(
            draw.rect(
                (i * 2, 0),
                ("3px", "100px"),
                fill=f"rgb({colour[0]}, {colour[1]}, {colour[2]})",
            )
        )
    return draw.tostring()
display.SVG(data=draw_colours(rgb_sample))
My first attempt at creating a gradient uses a nearest neighbour algorithm. I wrote a new Python package implementing this algorithm that can be seen at https://gitlab.com/landreville/neatnearestneighbour. The Euclidean distance between each RGB colour is used to find each nearest neighbour. The resulting image has a visible gradient from green to pink, but it clearly has entirely different colours mixed together. There are greens, purples, and pinks beside each other, and light and dark lines alternate.
nn = NearestNeighbour(
    rgb_sample,
    # Euclidean distance between RGB colours
    lambda a, b: math.sqrt((b[0] - a[0])**2 + (b[1] - a[1])**2 + (b[2] - a[2])**2)
)
sorted_sample = nn.run()
display.SVG(data=draw_colours(sorted_sample))
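For reference, the greedy ordering itself can be sketched in a few lines. This is only an illustration of the idea, not necessarily how the neatnearestneighbour package implements it: start anywhere, then repeatedly hop to the closest unvisited colour.

def nearest_neighbour_sketch(items, distance):
    remaining = list(items)
    # Start from an arbitrary colour
    path = [remaining.pop(0)]
    while remaining:
        # Greedily pick the unvisited colour closest to the last one chosen
        nearest = min(remaining, key=lambda item: distance(path[-1], item))
        remaining.remove(nearest)
        path.append(nearest)
    return path

A greedy pass like this is fast, but each local choice can strand very different colours next to each other later in the path, which is exactly the mixing visible above.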
I'm going to convert the colours from RGB to the CIELAB colour space. This colour space includes a measurement of perceptual lightness that can help ensure that light and dark colours do not end up together and upset the gradient. These next two functions convert RGB to LAB and vice versa. The required math is helpfully documented at http://www.brucelindbloom.com/, including a note about how the original CIELAB specification had a rounding error that needed to be fixed.
from collections import namedtuple
# The delta e cie2000 distance calc expects these fields
LabColour = namedtuple('LabColour', ('lab_l', 'lab_a', 'lab_b'))
# CIE standard constants for the LAB transfer function
ϵ = 216 / 24389
k = 24389 / 27
def rgb_to_lab(rgb):
    # Inverse sRGB companding to linear RGB
    convert = lambda c: c/12.92 if c <= 0.04045 else ((c+0.055)/1.055)**2.4
    r, g, b = (convert(val/255.0) for val in rgb)
    # Linear RGB to XYZ (sRGB matrix, D65 white point)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    labconv = lambda t: t**(1.0/3.0) if t > ϵ else (k*t + 16) / 116.0
    fy = labconv(y)
    l = 116 * fy - 16
    # X and Z are scaled by the D65 reference white before the transfer function
    a = 500 * (labconv(x/0.95047) - fy)
    b = 200 * (fy - labconv(z/1.08883))
    return LabColour(l, a, b)
lab_sample = [rgb_to_lab(rgb) for rgb in rgb_sample]
def lab_to_rgb(lab):
    l, a, b = lab
    fy = (l + 16) / 116
    fx = a / 500 + fy
    fz = fy - b / 200
    fx3 = fx**3
    fz3 = fz**3
    x = fx3 if fx3 > ϵ else (116*fx - 16) / k
    y = ((l + 16) / 116)**3 if l > k*ϵ else l / k
    z = fz3 if fz3 > ϵ else (116*fz - 16) / k
    # Scale back up by the D65 reference white before the inverse matrix
    x, z = x * 0.95047, z * 1.08883
    # XYZ to linear RGB (inverse sRGB matrix)
    r = 3.2406 * x + -1.5372 * y + -0.4986 * z
    g = -0.9689 * x + 1.8758 * y + 0.0415 * z
    b = 0.0557 * x + -0.2040 * y + 1.0570 * z
    # sRGB companding, clamped to the valid [0, 1] range
    scale = lambda c: 12.92 * c if c <= 0.0031308 else 1.055 * c**(1/2.4) - 0.055
    r, g, b = (min(max(scale(c), 0), 1) for c in (r, g, b))
    return (round(r * 255.0), round(g * 255.0), round(b * 255.0))
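A quick sanity check that the two functions invert each other. The round trip should reproduce the original RGB values to within a unit or so of error from the truncated matrix constants:

original = (120, 200, 80)
print(original, "->", lab_to_rgb(rgb_to_lab(original)))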
After converting to LAB and sorting the colours by lightness there is a clear gradient from dark to light, but the colours are still mixed together.
light_sorted = sorted(lab_sample, key=lambda el: el.lab_l)
display.SVG(data=draw_colours([
    lab_to_rgb(colour) for colour in light_sorted
]))
Now that we're using the CIELAB colour space we can also use a distance calculation that is based on visual perception. I wrote a pure Python implementation of Delta-E CIE2000, because I'm running this with PyPy and want the JIT compiler to do its magic. This result is starting to look like a gradient, but there are still out-of-place lines that just don't look right. The next step is to move on from nearest neighbour and try a different way to sort the colours.
nn = NearestNeighbour(light_sorted, delta_e_cie2000)
sorted_sample = nn.run()
display.SVG(data=draw_colours([
    lab_to_rgb(colour) for colour in sorted_sample
]))
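To build some intuition for the Delta-E values, here is an illustrative check. It assumes only that delta_e_cie2000 takes two LabColour values and returns a float, as it is used above: two colours that differ only in lightness should be far apart, while a barely different pair should be close.

dark = LabColour(20.0, 10.0, -30.0)
light = LabColour(85.0, 10.0, -30.0)
print(delta_e_cie2000(dark, light))                          # large distance
print(delta_e_cie2000(dark, LabColour(21.0, 10.0, -30.0)))   # small distance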
I realized that sorting the colours based on visual perception is essentially the travelling salesperson problem. What is the optimal route through all of the sample colours that will be the shortest perceptual distance overall? Solving that problem should result in a gradient that does not have jarring lines due to high perceptual distance between consecutive colours. I wrote a general Ant Colony Optimization algorithm in Python available at https://gitlab.com/landreville/neataco. I use the Delta-E CIE2000 function to find the distance between each colour (the cities if we follow the travelling salesperson analogy) and run ACO with the LAB colour sample.
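For intuition, the core loop of ant colony optimization can be sketched as below. This is a minimal illustration of the usual ACO formulation, not the neataco implementation: each ant builds a tour by choosing the next colour with probability proportional to pheromone**alpha * (1/distance)**beta, then pheromone evaporates and every tour deposits pheromone inversely proportional to its length.

import random

def aco_sketch(items, distance, ants=50, iterations=20,
               evaporation=0.2, alpha=1.0, beta=2.0):
    n = len(items)
    # Precompute pairwise distances (floored to avoid division by zero)
    dist = [[max(distance(items[i], items[j]), 1e-9) for j in range(n)]
            for i in range(n)]
    pheromone = [[1.0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")
    for _ in range(iterations):
        tours = []
        for _ in range(ants):
            start = random.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                current = tour[-1]
                candidates = list(unvisited)
                # Weight each candidate by pheromone and inverse distance
                weights = [
                    pheromone[current][j] ** alpha * (1.0 / dist[current][j]) ** beta
                    for j in candidates
                ]
                nxt = random.choices(candidates, weights=weights)[0]
                tour.append(nxt)
                unvisited.remove(nxt)
            length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
            tours.append((length, tour))
            if length < best_len:
                best_len, best_tour = length, tour
        # Evaporate, then reinforce the edges used by each tour
        for row in pheromone:
            for j in range(n):
                row[j] *= 1 - evaporation
        for length, tour in tours:
            for a, b in zip(tour, tour[1:]):
                pheromone[a][b] += 1.0 / length
    return [items[i] for i in best_tour]

Running the real package with the LAB sample and the perceptual distance: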
aco = AntColonyOptimization(
    light_sorted,
    delta_e_cie2000,
    ant_count=4096,
    evaporation=0.2,
    alpha=5,
    beta=40,
)
sorted_sample = aco.run()
display.SVG(data=draw_colours([
    lab_to_rgb(colour) for colour in sorted_sample
]))
This final result has an aesthetically pleasing gradient. Compared to the previous outputs, both lightness and colour shift naturally along the gradient.
Below is an example with another photo. The result has smooth gradients, although there is still some banding. This can happen because the image itself doesn't contain the colours needed to make a smooth transition.
display.Image(filename='fuchsia.jpg', width=400, height=400)
display.SVG(filename='fuchsia-gradient.svg')