Overview

Recent work shows that LLMs such as ChatGPT may pass the Turing test in human-like conversation, yet remain limited even on simple reasoning tasks (Biever, 2023), and it is still unclear whether LLMs reason at all (Mitchell, 2023). Human reasoning has been characterized as a dual-process phenomenon (Sun, 2023), or as fast and slow thinking (Kahneman, 2011). These findings suggest two directions for exploring neural reasoning: starting from existing neural networks and enhancing their reasoning performance toward symbolic-level reasoning, and starting from symbolic reasoning and exploring novel neural implementations of it. Ideally, these two directions will meet somewhere in the middle, yielding representations that serve as a bridge between novel neural computing, which differs qualitatively from traditional neural networks, and novel symbolic computing, which inherits the good features of neural computing. Hence the name of our workshop, with a focus on Natural Language Processing and Knowledge Graph reasoning. The workshop promotes research in both directions, and particularly seeks novel proposals in the second.

For paper submissions, please use the following link: Submission Link.

Invited Speakers

Pascale Fung

The Hong Kong University of Science and Technology

Alessandro Lenci

University of Pisa

Volker Tresp

Ludwig-Maximilians-University Munich

Juanzi Li

Tsinghua University

Organizers

Tiansi Dong

Fraunhofer IAIS

Erhard Hinrichs

University of Tübingen

Zhen Han

Amazon Inc.

Kang Liu

Chinese Academy of Sciences

Yangqiu Song

The Hong Kong University of Science and Technology

Yixin Cao

Singapore Management University

Christian F. Hempelmann

Texas A&M University-Commerce

Rafet Sifa

University of Bonn