Decoding syllables from human fMRI activity

Yohei Otaka, Rieko Osu, Mitsuo Kawato, Meigen Liu, Satoshi Murata, Yukiyasu Kamitani

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Language plays essential roles in human cognition and social communication; a technology for reading out speech from non-invasively measured brain activity would therefore have both scientific and clinical merits. Here, we examined whether individual syllables can be decoded from human fMRI activity. Four healthy subjects participated in the experiments. In each decoding session, the subjects repeatedly uttered a syllable presented on a screen at 3 Hz for a 12-s block; nine different syllables were presented in a single experimental run, which was repeated eight times. We also identified the voxels showing articulation-related activity during utterance of all the syllables of Japanese phonology in a conventional task-rest sequence. We then used either all of these voxels, or the subset lying within anatomically defined ROIs (M1, cerebellum), as features for training and testing a decoder (linear support vector machine) that classifies brain activity patterns evoked by different syllables. To evaluate decoding performance, we performed cross-validation, testing the samples of one decoding session with a decoder trained on the samples of the remaining sessions. Syllables were correctly decoded at above-chance levels. The results suggest the possibility of using non-invasively measured brain activity to read out the intended speech of patients with speech motor disorders.
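The evaluation scheme the abstract describes (a linear classifier over voxel patterns, cross-validated by leaving one session out) can be sketched as follows. This is a minimal illustration on synthetic data: the dimensions, noise levels, and data are invented, and a simple least-squares linear classifier stands in for the paper's linear SVM.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 8 runs, 9 syllables, 50 "voxels".
n_sessions, n_syllables, n_voxels = 8, 9, 50

# Synthetic voxel patterns: one sample per (session, syllable),
# drawn around a syllable-specific mean pattern.
means = rng.normal(size=(n_syllables, n_voxels))
X = np.vstack([means + rng.normal(size=(n_syllables, n_voxels))
               for _ in range(n_sessions)])
y = np.tile(np.arange(n_syllables), n_sessions)
sessions = np.repeat(np.arange(n_sessions), n_syllables)

def fit_linear(X, y, n_classes, ridge=1e-3):
    """Least-squares linear classifier on one-hot targets
    (a stand-in for the paper's linear SVM)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    T = np.eye(n_classes)[y]                   # one-hot class targets
    W = np.linalg.solve(Xb.T @ Xb + ridge * np.eye(Xb.shape[1]), Xb.T @ T)
    return W

def predict(W, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ W).argmax(axis=1)

# Leave-one-session-out cross-validation: train on 7 sessions,
# test on the held-out one, and pool the results.
correct = 0
for s in range(n_sessions):
    train, test = sessions != s, sessions == s
    W = fit_linear(X[train], y[train], n_syllables)
    correct += int((predict(W, X[test]) == y[test]).sum())

accuracy = correct / len(y)
print(f"cross-validated accuracy: {accuracy:.2f} (chance = {1/9:.2f})")
```

With nine classes, chance performance is 1/9, so the decoder is informative only if the pooled accuracy clearly exceeds that level; the paper applies the same logic to real fMRI samples instead of this synthetic stand-in.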

Original language: English
Title of host publication: Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers
Pages: 979-986
Number of pages: 8
Edition: PART 2
DOI: 10.1007/978-3-540-69162-4_102
ISBN (Print): 3540691596, 9783540691594
Publication status: Published - 23-10-2008
Event: 14th International Conference on Neural Information Processing, ICONIP 2007 - Kitakyushu, Japan
Duration: 13-11-2007 – 16-11-2007

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 2
Volume: 4985 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 14th International Conference on Neural Information Processing, ICONIP 2007
Country: Japan
City: Kitakyushu
Period: 13-11-2007 – 16-11-2007

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Otaka, Y., Osu, R., Kawato, M., Liu, M., Murata, S., & Kamitani, Y. (2008). Decoding syllables from human fMRI activity. In Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers (PART 2 ed., pp. 979-986). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 4985 LNCS, No. PART 2). https://doi.org/10.1007/978-3-540-69162-4_102
@inproceedings{ee4bdfc2940c43faacd4c2d5f9207c31,
title = "Decoding syllables from human fMRI activity",
author = "Yohei Otaka and Rieko Osu and Mitsuo Kawato and Meigen Liu and Satoshi Murata and Yukiyasu Kamitani",
year = "2008",
month = "10",
day = "23",
doi = "10.1007/978-3-540-69162-4_102",
language = "English",
isbn = "3540691596",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
number = "PART 2",
volume = "4985 LNCS",
pages = "979--986",
booktitle = "Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers",
edition = "PART 2",
}
