Building Relief Maps with Real Neurons

Recently, I was working on a project that involved creating arbitrary stimulus waveforms from Python arrays for patch-clamp ephys. While working on it, I wondered how far I could push cells to respond to arbitrary stimulus waveforms: could we essentially 'write' text using a patch-clamped neuron's membrane potential? From there, the idea branched out into converting all kinds of images into stimulus waveforms for patch clamping. I hope this project shares just how cool the field of patch-clamp electrophysiology is:


How it works:

Ideally, we can take a given image and convert it into an array of numbers, each representing the RGB value of a pixel. This is a common operation in the Python computer-vision world, so we can leverage existing libraries to import and transform our image:

In [4]:
import numpy as np
import matplotlib.pyplot as plt
from pyabf.abfWriter import writeABF1
from scipy.signal import savgol_filter

from PIL import Image
import plotly.express as px
# Open the image and rotate it so the sweeps run left to right
im = Image.open("horse.jpg")
im = im.rotate(90)
plt.imshow(im)
print(np.array(im).shape)
(194, 259, 3)

Next, we want to resize the image so that its width matches our sweep length times the sample rate. As it stands, too large an image means too many data points, and your sweeps will be too long; too small an image, and your sweeps will be too short.

In [5]:
sweeplength = 1      # seconds per sweep
dt = 1/10000         # sample interval (10 kHz sample rate)
x_points = sweeplength/dt
im = im.resize((int(x_points), im.size[1]))
image_ar = np.array(im).astype(int)
# convert to greyscale by averaging the RGB channels
image_ar = np.mean(image_ar, axis=-1)
#image_ar = (np.amin(image_ar, axis=-1)/250).astype(int)
plt.imshow(image_ar)
print(image_ar.shape)
(194, 10000)

It looks funky now, but that is okay! The next step is to cut down the number of sweeps. The image above is 194 by 10000 pixels. If we take the 194 rows to be our sweeps, then at 1 second per sweep this protocol would last about 3.2 minutes. To reduce the number of sweeps, we can do one of two things:

  1. take the mean of every X rows
  2. take every Xth row

I found that with most images, option #2 was perfectly okay and preserved most of the detail:

In [6]:
stepsize=2
image_ar = image_ar[::stepsize]
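If option #1 suits your image better, a minimal numpy sketch of block-averaging the rows might look like this (the `block_mean_rows` helper and the demo array are my own, not part of the notebook above):

```python
import numpy as np

def block_mean_rows(arr, stepsize):
    """Average every `stepsize` consecutive rows (option #1 above)."""
    n_rows = (arr.shape[0] // stepsize) * stepsize  # drop any leftover rows
    trimmed = arr[:n_rows]
    return trimmed.reshape(-1, stepsize, arr.shape[1]).mean(axis=1)

demo = np.arange(12.0).reshape(6, 2)   # 6 rows, 2 columns
print(block_mean_rows(demo, 2).shape)  # (3, 2): 3 averaged rows remain
```

Block averaging keeps information from every row at the cost of slightly blurring the image along the sweep axis, whereas slicing simply discards the skipped rows.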

Finally, we want to scale the image to an appropriate stimulus range for the neurons you patch. In my case, a range of -10 to 10 pA was needed.

In [7]:
def min_max_scale(x, range=(0,1)):
    # rescale x to [0, 1], then stretch and shift into the target range
    x = (x - x.min()) / (x.max() - x.min())
    x *= range[1] - range[0]
    x += range[0]
    return x
image_ar = min_max_scale(image_ar, (-10, 10))
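As a quick sanity check (my own addition, not in the original notebook), the scaler should map the array minimum to -10 and the maximum to 10. The function is repeated here so the snippet runs standalone:

```python
import numpy as np

def min_max_scale(x, range=(0,1)):
    # rescale x to [0, 1], then stretch and shift into the target range
    x = (x - x.min()) / (x.max() - x.min())
    x *= range[1] - range[0]
    x += range[0]
    return x

demo = np.array([0.0, 50.0, 100.0])
scaled = min_max_scale(demo, (-10, 10))
print(scaled)  # [-10.   0.  10.]
```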

To write the converted data to an ABF file, we can use the excellent ABF writer included in the pyabf package.

In [8]:
writeABF1(image_ar, 'out.abf', (1/dt))
INFO:pyabf.abfWriter:Creating an ABF1 file 1.94 MB in size ...
INFO:pyabf.abfWriter:wrote out.abf

It's a bit messy looking, but it worked! clampfit.png

To use this with Clampex, simply go to Edit Protocol -> Waveform -> Stimulus File. clampex.png