2nd Conversational Intelligence Summer School
About the School
This is planned as a one-week intensive program with instruction and programming tutorials in the morning, an invited-speaker colloquium after lunch, and supervised project work in the late afternoon. Instruction will include an introduction to the principles of deep neural networks used for natural language processing, including recurrent neural networks, convolutional networks, and sequence-to-sequence encoder-decoder architectures with attention, as well as vector-space lexical embeddings and other more advanced topics. Participants will work in teams to build working prototype conversational agents to be released at the end of the session.

Hosted by UMass Lowell, the summer school offers participants a chance to build a solid understanding of core conversational AI concepts and techniques.
Requirements
The intended audience of this summer school consists of 1st- and 2nd-year PhD and Master's students, advanced undergraduate students, and industry participants
One semester of machine learning, with some exposure to neural network models; a background in natural language processing is not required but desirable
Participants
Price
Participation is free to students
Qualification
Participants will be selected using a set of qualifying problems
Capacity
Maximum number of participants: 50
Invited Speakers
Jason Weston
Facebook AI Research (FAIR)
Alexander Rush
Dept of Computer Science, Harvard University
João Sedoc
Dept of Computer and Information Science, University of Pennsylvania
Kate Saenko
Dept of Computer Science, Boston University
Instructors
Instructors will include the organizers as well as several researchers from the organizers' research labs.
Text Machine Lab
University of Massachusetts Lowell
Yuanliang Meng, Peter Potash, Alexey Romanov
Neural Networks and Deep Learning Lab
Moscow Institute of Physics and Technology (MIPT)
Mikhail Arkhipov, Yuri Kuratov, Maria Vikhreva, Alexey Sorokin, Dilyara Baymurzina
Schedule
Monday, June 24
9:00–11:00
Lectures
Topic 1
  • Word representations
  • Lexical embeddings
  • TF-IDF
Topic 2
  • Neural architectures for text processing
  • Loss functions and backpropagation
  • Dropout and other regularization methods
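To make the first topic concrete, TF-IDF can be computed in a few lines of plain Python. This is a minimal illustrative sketch (raw term frequency, idf = log(N/df)), not part of the course materials:

```python
import math
from collections import Counter

def tf_idf(docs):
    """TF-IDF weights for a list of tokenized documents.

    Uses raw term frequency and idf = log(N / df): terms that occur in
    every document get weight 0, rare terms are weighted up.
    """
    n_docs = len(docs)
    # document frequency: in how many documents each term appears
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({term: count * math.log(n_docs / df[term])
                        for term, count in tf.items()})
    return weights

docs = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "cats and dogs".split(),
]
w = tf_idf(docs)
```

Here "cat" (one document out of three) gets weight log(3), while "the" and "on" (two documents each) are discounted accordingly.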
11:30–13:00
Tutorials
Sentence classification with word embeddings.
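A minimal flavor of this tutorial in plain Python: average word vectors into a sentence vector and classify by nearest class centroid. The 3-dimensional vectors and class names below are invented for illustration; the tutorial itself would use pretrained embeddings:

```python
import math

# Toy 3-d word vectors; a real setup would load pretrained embeddings
# (e.g. word2vec or GloVe). All values here are illustrative.
EMB = {
    "good":  [0.9, 0.1, 0.0],
    "great": [0.8, 0.2, 0.0],
    "bad":   [-0.9, 0.1, 0.0],
    "awful": [-0.8, 0.2, 0.0],
    "movie": [0.0, 0.0, 1.0],
    "film":  [0.0, 0.1, 0.9],
}

def embed(sentence):
    """Average the vectors of known tokens (a bag-of-embeddings sentence vector)."""
    vecs = [EMB[t] for t in sentence.lower().split() if t in EMB]
    if not vecs:
        return [0.0] * 3
    return [sum(dim) / len(vecs) for dim in zip(*vecs)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Nearest-centroid classifier over two toy sentiment classes
CENTROIDS = {"pos": embed("good great"), "neg": embed("bad awful")}

def classify(sentence):
    return max(CENTROIDS, key=lambda c: cosine(embed(sentence), CENTROIDS[c]))

print(classify("a great movie"))  # -> pos
```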
14:30–16:00
Invited Talk
Review talk
16:00–22:00
Project Session
Team building. Project setup.
Tuesday, June 25
9:00–11:00
Lectures
Topic 1
  • ConvNets
Topic 2
  • RNNs (LSTM, GRU)
  • Sequence-to-sequence encoder/decoder models
  • Sequence-to-sequence with attention
  • Contextualized embeddings: ELMo; BERT
11:30–13:00
Tutorials
Sequence tagging for entity recognition.
14:30–16:00
Invited Talk
Review talk
16:00–22:00
Project Session
Working on projects
Wednesday, June 26
9:00–11:00
Lectures
Topic 1
  • Attention-based architectures: transformers
Topic 2
  • Memory-based models
11:30–13:00
Tutorials
Chit-chat bot trained on Twitter/Open subtitles.
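For contrast with the trained model in this tutorial, a simple retrieval baseline for chit-chat can be sketched in a few lines: reply with the stored response whose context best overlaps the incoming message. The tiny corpus below is invented for illustration, not taken from Twitter or OpenSubtitles:

```python
# (context, response) pairs — an illustrative stand-in for a real corpus
PAIRS = [
    ("hello how are you", "i am fine thanks"),
    ("what is your favourite movie", "i really like sci-fi films"),
    ("do you like music", "yes i listen to jazz a lot"),
]

def overlap(a, b):
    """Jaccard similarity between two whitespace-tokenized strings."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def reply(message):
    """Return the response whose stored context best matches the message."""
    context, response = max(PAIRS, key=lambda p: overlap(message, p[0]))
    return response

print(reply("what is your favourite film"))
```

Generative seq2seq models replace this lookup with a decoder that produces the response word by word, but retrieval baselines like this one are a common point of comparison.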
14:30–16:00
Invited Talk
Review talk
16:00–22:00
Project Session
Working on projects
Thursday, June 27
9:00–11:00
Lectures
Topic 1
  • Attention-based architectures: transformers
Topic 2
  • Memory-based models
11:30–13:00
Tutorials
Working on projects
14:30–16:00
Invited Talk
Review talk
16:00–22:00
Project Session
Working on projects
Friday, June 28
9:00–11:00
Lectures
Advanced Topics: Dialogue Generation and Conversational Agents
  • Multi-skilled agents
  • Hierarchical models (HRED, VHRED, etc.)
  • External knowledge integration
11:30–13:00
Tutorials
Working on projects
14:30–16:00
Invited Talk
Review talk
16:00–22:00
Project Session
Prepare posters
Saturday, June 29
9:00–13:00
Closing reception
Short oral presentations
Poster session
15:00–21:00
Social event
Sample Topics for Participant Projects
Participants will select from a number of projects for which an initial code base will be made available. Sample topics include:
Engaging chit-chat model, trained on Reddit, that keeps the interlocutor engaged in the conversation
Infotainment chat model using the Wizard of Wikipedia dataset
Multi-task goal-oriented chat model using the MultiWOZ dataset (A Large-Scale Multi-Domain Wizard-of-Oz Dataset for Task-Oriented Dialogue Modelling)
Computational resources
A teaching lab with 30 GPU workstations will be available at UMass Lowell for use by the participants
Additional remote virtual GPU machines can be provided by the MIPT lab
Participants will also be able to use Google Colaboratory
Important dates
Feb–March 2019
Qualification round
April 2019
Qualifying problem scoring, participant selection
24–29 June 2019
Summer school