morphecore was an experimental lecture-performance probing new possibilities at the intersection of neuroscience and dance.
Armed with MRI and brain decoding technology, the project reconstructed bodily poses from visual cortex activity, generating choreography that was further manipulated to test a range of physical variables—from the effects of gravity to muscle elasticity and joint rotation—in an exploration of what modes of dance might arise free from real-world constraints.
The results were presented in a video narrated by a Daito Manabe avatar, culminating in a dance performance by this digital Daito that became increasingly abstract as it transcended the physical limits of the human body.
The dance performance drew on 3D scan data of Daito Manabe, visual cortex activity recorded by fMRI, and motion capture data from Shingo Okamoto and ELEVENPLAY dancers.
By parsing and reconstructing dance as three constituent elements—pose, motion, and choreography—we sought to probe modes of physical expression free from the constraints of gravity and the physical limits of the body’s range of motion. Further studying the “noise” and “glitches” that arise in chaotic neural processes, the project anticipated a future when dance might be generated by sound stimuli to the brain that produce an interactive response in the body.
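The three-part decomposition above can be sketched as a toy model: a pose as a vector of joint angles, motion as interpolation between poses, and the joint-limit clamp standing in for the body's real-world range of motion, which the project deliberately removes. This is a hypothetical illustration; the limit values, names, and structure are assumptions, not the project's actual pipeline.

```python
import numpy as np

# A pose is a joint-angle vector; "motion" blends between poses; a
# "choreography" would be an ordered sequence of such target poses.
JOINT_LIMITS = (-2.0, 2.0)  # radians; illustrative stand-in for human range of motion

def interpolate(pose_a, pose_b, t, constrained=True):
    """Blend two poses; optionally clamp to human joint limits."""
    pose = (1 - t) * pose_a + t * pose_b
    if constrained:
        pose = np.clip(pose, *JOINT_LIMITS)
    return pose

pose_a = np.array([0.0, 1.5, -1.0])
pose_b = np.array([3.0, -2.5, 2.8])   # deliberately beyond human limits

human = interpolate(pose_a, pose_b, 0.9)                          # body-constrained
unconstrained = interpolate(pose_a, pose_b, 0.9, constrained=False)  # "freed" motion
```

Dropping the clamp is the code-level analogue of the project's question: what movement becomes available once joint rotation is no longer bounded by anatomy.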
As the coronavirus pandemic prevented the gathering of new data from in-person test subjects, the work was produced as a prototype based on simulations created with reference to prior fMRI data and procedures acquired in 2018.
Although an actual future data set will inevitably yield different results, the brain decoding methodologies underlying the project remain the same.
It is a work in progress.
[Research]
Since 2014, Daito Manabe has experimented with the brain decoding technology being researched by Dr. Yukiyasu Kamitani at Kyoto University.
“Brain decoding” seeks to look into the mind’s eye by reconstructing images seen by test subjects based on brain activity in their cerebral cortex.
In 2018, Manabe adapted this technique in a series of installations and live shows that generated images imagined while listening to music, a novel departure from conventional artificial-synesthesia and VJ approaches to visualization that rely on music waveforms and spectra.
In morphecore, brain decoding technology was used to extract poses imagined in the mind. These poses were assembled with motion data and choreography to create a dancing CG Manabe avatar.
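The decoding step can be sketched, in a deliberately simplified form, as learning a linear map from fMRI voxel activity to pose parameters. Real brain decoding research uses far richer models; the ridge-regression sketch below, with synthetic data and illustrative sizes, only shows the shape of the problem. All names and numbers are assumptions.

```python
import numpy as np

# Synthetic stand-in for fMRI decoding: n_trials recordings of n_voxels,
# each paired with a pose described by n_joints joint angles.
rng = np.random.default_rng(0)
n_trials, n_voxels, n_joints = 200, 500, 17

true_W = rng.normal(size=(n_voxels, n_joints))
voxels = rng.normal(size=(n_trials, n_voxels))                       # brain activity
poses = voxels @ true_W + 0.1 * rng.normal(size=(n_trials, n_joints))  # paired poses

# Ridge regression: W = (X^T X + lam I)^-1 X^T Y
lam = 1.0
W = np.linalg.solve(voxels.T @ voxels + lam * np.eye(n_voxels), voxels.T @ poses)

decoded = voxels @ W   # pose vectors reconstructed from brain activity
```

In the project's terms, `decoded` plays the role of the imagined poses extracted from visual cortex activity, which are then assembled with motion and choreography data.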
[Award]
Prix Ars Electronica 2022
Interactive Art +
Honorary Mention
[Presentation List]
2020
Sep.18 - Sep.19
Streaming for Sónar+D CCCB 2020 Music
(Barcelona, Spain)
Nov.25 - Dec.5
RealMix Festival
Online Event (Bogotá, Colombia)
Dec.11
MUTEK.JP
at Shibuya Stream Hall, Shibuya
(Tokyo, Japan)
2021
Mar.20-Jun.22
"rhizomatiks_multiplex"
at Museum of Contemporary Art Tokyo
(Tokyo, Japan)
Apr.26-May.1
CAIROTRONICA-Cairo Electric and Media Art Festival
at the Factory and Tahrir Cultural Center (Cairo, Egypt)
Sep.2021 - Aug.2022
"SEEING The INVISIBLE -AN AUGMENTED REALITY CONTEMPORARY ART EXHIBITION"
at
Royal Botanic Garden Victoria Cranbourne, Australia,
Royal Botanic Garden Victoria Melbourne, Australia,
Royal Botanical Gardens, Burlington, Canada,
Jerusalem Botanical Garden, Israel
Kirstenbosch Botanical Garden, South Africa
Eden Project, United Kingdom
Tucson Botanical Gardens, USA
San Diego Botanic Garden, USA
Denver Botanic Gardens, USA
Marie Selby Botanical Gardens, USA
Elm Bank Garden, Massachusetts Horticultural Society, USA...
https://seeingtheinvisible.art/daito-manabe/
(Jerusalem, Israel)
Oct.5
Streaming for 751 International Design Festival
(Beijing, China)
Nov.5-15
Online streaming for
Daito Manabe “morphecore prototype 2021 version” on ZER01NE DAY 2021
(Seoul, Korea)
Jan.21
Online streaming for FESTIVAL MULTIPLICIDADE 20_21
(Rio de Janeiro, Brazil)
[Credit]
Motion capture dancer and choreographer: Shingo Okamoto
Supervisor: MIKIKO (ELEVENPLAY)
Music co-producer: Hopebox
Editing Director: Kenichiro Shimizu (PELE)
CG Director: Kenta Katsuno (+Ring)
Effects Artists: Tetsuro Takeuchi (quino grafix)
Effects Artists: Jun Satake (TMS JINNIS)
Effects Artists: Tai Komatsu (cai), Keisuke Toyoura (cai)
Effects Artists: Mikita Arai (Freelance)
Effects Artists: Tsukasa Iwaki (+Ring)
CG Producer: Toshihiko Sakata (+Ring)
Data Processing: 2bit
Motion Capture: Crescent, inc.
3D Scan: K’s DESIGN LAB
Compositor: Naoya Kawata (PELE)
Project Manager: Naoki Ishizuka (Rhizomatiks) + Yurino Nishina (PELE)
Producer: Takao Inoue (Rhizomatiks)