Recently, I was working on a project that involved creating arbitrary stimulus waveforms from Python arrays for patch-clamp ephys. While working on this, I wondered just how far I could push the cells' ability to respond to stimulus waveforms. I wanted to see if we could essentially 'write' text using patch-clamped neurons' membrane potential. From there, this branched out into converting all kinds of images into stimulus waveforms to use in patch clamping. I hope this project shares just how cool the field of patch-clamp electrophysiology is.
Ideally, we can take a given image and convert it to an array of numbers, each representing the RGB code for a given pixel. This is a common operation in the Python computer-vision world, so we can leverage that tooling to import and transform our image:
import numpy as np
import matplotlib.pyplot as plt
from pyabf.abfWriter import writeABF1
from scipy.signal import savgol_filter
from PIL import Image
import plotly.express as px

# open the image and rotate it so image rows become sweeps
im = Image.open("horse.jpg")
im = im.rotate(90)
plt.imshow(im)
print(np.array(im).shape)
Next, we want to resize the image so its width matches our sweep length times the sample rate. Too large an image means too many data points, and your sweeps will be too long; too small an image, and your sweeps will be too short.
sweeplength = 1              # seconds per sweep
dt = 1/10000                 # sample interval for a 10 kHz sample rate
x_points = sweeplength/dt    # samples per sweep
im = im.resize((int(x_points), im.size[1]))
image_ar = np.array(im).astype(int)
# convert to grayscale by averaging the RGB channels
image_ar = np.mean(image_ar, axis=-1)
plt.imshow(image_ar)
print(image_ar.shape)
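To see what the channel-averaging step does, here is a toy example (the pixel values are illustrative, not from the horse image):

```python
import numpy as np

# toy 1x2 RGB image: one pure-red pixel, one white pixel
rgb = np.array([[[255, 0, 0], [255, 255, 255]]])

# averaging over the last axis collapses the 3 color channels to 1 value
grey = np.mean(rgb, axis=-1)
print(grey)  # red averages to 85, white stays at 255
```

Each pixel's three channel values are replaced by their mean, so the array drops from shape (1, 2, 3) to (1, 2).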
It looks funky now, but that is okay! The next step is to reduce the number of sweeps. The above image has a size of 194 by 10000 pixels. If we take the 194 to be our sweep dimension, then at 1 second a sweep, this protocol would last about 3.2 minutes. To reduce the number of sweeps, we can do one of two things: crop the image down to a band of rows, or downsample by keeping only every Nth row.
I found that with most images, option #2 was perfectly okay and preserved most of the detail:
# keep every other row, halving the number of sweeps
stepsize = 2
image_ar = image_ar[::stepsize]
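For reference, option #1 (cropping to a band of rows) could look something like this; the array shape and row range here are illustrative, not from the original image:

```python
import numpy as np

# stand-in for the grayscale image array (rows = sweeps, columns = samples)
image_ar = np.zeros((194, 10000))

# option #1: keep only a horizontal band of the image
cropped = image_ar[40:140]
print(cropped.shape)  # 100 sweeps of 10000 samples
```

Cropping keeps full detail within the band but discards everything outside it, which is why downsampling tends to work better for whole images.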
Finally, we want to scale the image to an appropriate stimulus range for the neurons you patch. In my case, a range of -10 to 10 pA was needed.
def min_max_scale(x, out_range=(0, 1)):
    # rescale x linearly so its extremes land on the ends of out_range
    x = (x - x.min()) / (x.max() - x.min())
    return x * (out_range[1] - out_range[0]) + out_range[0]
image_ar = min_max_scale(image_ar, (-10, 10))
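As a quick sanity check, the scaler should map any input array onto the requested range; here it is on a toy array (the values are illustrative):

```python
import numpy as np

def min_max_scale(x, out_range=(0, 1)):
    # rescale x linearly so its extremes land on the ends of out_range
    x = (x - x.min()) / (x.max() - x.min())
    return x * (out_range[1] - out_range[0]) + out_range[0]

toy = np.array([0.0, 50.0, 100.0])
scaled = min_max_scale(toy, (-10, 10))
print(scaled)  # maps to [-10, 0, 10]
```

The minimum lands on -10, the maximum on 10, and everything in between scales linearly, which is exactly what we want for pA stimulus amplitudes.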
To write the converted data to an abf file, we can use the excellent ABF writer included in the pyabf package.
writeABF1(image_ar, 'out.abf', sampleRateHz=(1/dt))
It's a bit messy looking, but it worked!
To use this with Clampex, simply go to Edit Protocol -> Waveform -> Stimulus File.