How to reconstruct from brain images which letter a person was reading | KurzweilAI

The researchers “taught” the model how 1,200 voxels (volumetric pixels) of 2×2×2 mm from the brain scans correspond to individual pixels in different versions of handwritten letters...
“In our further research we will be working with a more powerful MRI scanner,” said Sanne Schoenmakers, who is working on a thesis about decoding thoughts. “Due to the higher resolution of the scanner, we hope to be able to link the model to more detailed images. We are currently linking images of letters to 1200 voxels in the brain; with the more powerful scanner we will link images of faces to 15,000 voxels.”
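The voxel-to-pixel mapping described above can be sketched as a simple linear decoder: learn a matrix that maps voxel activations to pixel intensities, then apply it to a new scan. This is a minimal illustration with synthetic data and a plain ridge-regression fit; the dimensions (1,200 voxels, a 28×28 letter image) and all names are assumptions, and the researchers' actual model is not specified in the article.

```python
import numpy as np

# Synthetic stand-in for the study's data: n_scans brain scans,
# each with 1200 voxel activations, paired with 28x28 letter images.
rng = np.random.default_rng(0)
n_scans, n_voxels, n_pixels = 300, 1200, 28 * 28

true_W = rng.normal(size=(n_voxels, n_pixels)) * 0.05   # hypothetical ground-truth map
voxels = rng.normal(size=(n_scans, n_voxels))           # simulated voxel activations
pixels = voxels @ true_W + rng.normal(scale=0.1, size=(n_scans, n_pixels))

# Ridge regression: W = (X^T X + lam*I)^(-1) X^T Y
# "teaches" the model how voxels correspond to pixels.
lam = 1.0
W = np.linalg.solve(voxels.T @ voxels + lam * np.eye(n_voxels),
                    voxels.T @ pixels)

# Decode a new scan: predicted pixel intensities of the letter being read.
new_scan = rng.normal(size=(1, n_voxels))
reconstructed = new_scan @ W
print(reconstructed.shape)  # (1, 784) -> reshape to 28x28 to view the letter
```

Moving to 15,000 voxels, as the quote anticipates, would only change `n_voxels` here, though in practice the higher dimensionality makes regularization more important.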