We are very happy to announce the

Roundtable on Multimodal Speech Data

Data – Annotation – Strategies of Analysis

Thursday, 25 January 2024
(IG-NG 2.701)

Morning Session:

9:15 – 13:00: Open discussion on Multimodal Speech Data

  1. Introduction of each group and data set (3 min per group)
  2. Practical part: methods, annotation, analysis, tips and tricks
  3. Discussion: theory-driven vs. data-driven approaches to multimodal data

Afternoon Session:

All times CET.

14:15-14:35   Petra Wagner & Olcay Türk (Bielefeld University):
              Eliciting and Measuring Understanding in Interaction – (Some) Lessons Learned

14:35-14:55   Margaret Zellers (Kiel University):
              Placement and temporal alignment of complex gesture strokes in Luganda conversation

14:55-15:15   Sophie Repp & Cornelia Loos (Universität zu Köln & Hamburg University):
              Gesture in polar responses

15:15-15:35   Susanne Fuchs & Aleksandra Ćwiek (Leibniz ZAS Berlin):
              The Coordination of Dynamic Multimodal Signals in Novel Communication

15:35         Coffee break

16:05         Stefan Baumann (Universität zu Köln):
              Head gestures and pitch accents as cues to information status in French

16:25-16:45   MultIS (Goethe University Frankfurt):
              On eliciting multimodal data – the challenge between controlling the context and eliciting spontaneous speech and gesture

16:45-17:15   General Discussion