JOBS

Currently Available Jobs


Fall Internship Positions

If you are interested in any of the jobs listed below, please apply via the application form (Google account login required).

RAPT Project

Building and Improving the Design of a Rapport-Building Intelligent Tutoring Agent

Intelligent conversational agents are all around us, from Siri to Alexa and many more. In the Articulab, we study human behavior using computational linguistics and machine learning to build virtual agents that can respond to people in increasingly natural, social ways. What if you could have a virtual agent as a teacher? How would you want it to respond to the anxieties and social dynamics inherent in learning? In the Rapport-Aware Peer Tutor (RAPT) project, we study how the interpersonal closeness, or rapport, between people improves their learning, using computational tools to detect the verbal and nonverbal behaviors that contribute to that rapport, in order to build a socially responsive virtual tutor.

This fall, we’re looking for students with backgrounds in computer science, natural language processing, machine learning, psychology, or HCI to help us build and improve the design of a rapport-building virtual agent as part of an intelligent tutoring system. If you’re interested in this research internship, please apply via the application form.

Validation of the Thin-Slicing Method for Sequential Social Signal Annotation

We are looking for one or two talented students to help us validate the quality of annotations of rapport (and other social signals) collected with the thin-slicing method. We are investigating how social signals advance and control an interaction and shape the long-term bond between people. Social measures such as rapport or engagement change slowly over the course of a conversation (though it remains unclear just how slowly), and human judgments of social behavior have been shown to reach high accuracy after only 30 seconds of observation, comparable to judgments based on full conversations. As a consequence, rapport is frequently annotated on consecutive 30-second stretches of video or audio of an interaction. However, it is unclear how well 30-second intervals capture the rate at which rapport changes over the course of a conversation, and how the placement of relevant cues within each interval influences the overall ratings.

In this internship you will learn how to prepare and analyze data, how to collect crowdsourced annotations, and how to use NLP and multimodal analysis tools to find relevant cues in a conversation. In addition, you will learn about theories of social behavior in conversation.
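To give a concrete flavor of what thin-slice annotation involves, here is a minimal sketch in Python of cutting a conversation’s timeline into consecutive 30-second slices and averaging crowdsourced rapport ratings within each slice. The ratings, annotator IDs, and 1-to-7 scale below are hypothetical placeholders, not the lab’s actual annotation pipeline or tools.

    # Toy illustration of thin-slice aggregation: group crowdsourced rapport
    # ratings by consecutive 30-second slices and average them per slice.
    # All data below are made up for the sake of the example.
    from collections import defaultdict

    SLICE_LEN = 30  # seconds per thin slice

    def slice_index(time_sec, slice_len=SLICE_LEN):
        """Map a timestamp (seconds into the conversation) to its slice number."""
        return int(time_sec // slice_len)

    def aggregate_ratings(ratings):
        """Return {slice index: mean score} from (annotator, time_sec, score) tuples."""
        by_slice = defaultdict(list)
        for _annotator, time_sec, score in ratings:
            by_slice[slice_index(time_sec)].append(score)
        return {idx: sum(scores) / len(scores) for idx, scores in sorted(by_slice.items())}

    if __name__ == "__main__":
        toy_ratings = [
            ("a1", 10, 4), ("a2", 15, 5), ("a3", 25, 4),  # 0-30s
            ("a1", 40, 5), ("a2", 50, 6), ("a3", 55, 5),  # 30-60s
            ("a1", 70, 6), ("a2", 80, 6), ("a3", 85, 7),  # 60-90s
        ]
        print(aggregate_ratings(toy_ratings))  # one rapport value per slice

Comparing such slice-level trajectories against finer- or coarser-grained slicings is one way to probe whether 30 seconds is the right window, which is exactly the question this internship addresses.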

SCIPR Project

Developing an Intelligent Virtual Child to Raise Curiosity in Small Group Game Play

Curiosity inspires you to study your favorite subjects and stay up late reading about topics you’re passionate about. Sadly, curiosity is becoming less common in elementary and middle schools in our increasingly test-oriented society. Our lab is developing an intelligent virtual child and interactive tabletop that will evoke curiosity, exploration, and self-efficacy through collaboration in a playful learning environment. For more details about the project, see the project description on our website.

We are looking for several interns with a Computer Science background who are interested in Human-Computer Interaction, Artificial Intelligence, or Natural Language Processing, as well as students in Linguistics, Psychology, or the Social Sciences, to take part in designing and developing a virtual child that can engage in a multi-party tangible tabletop game and carry out verbal and nonverbal behaviors to elicit curiosity during gameplay. Below we list the desired skills for each of the three sub-projects:

User interface design and development

  • Hands-on experience in user interface design (required), system development (required) and usability evaluation (preferred)
  • Strong programming skills with Java (required), JavaScript (required) and Python (preferred)
  • Experience in hardware programming (e.g. Arduino, RFID) (preferred)

Decision making module design and implementation

  • Hands-on experience in system development (required)
  • Course experience in Artificial Intelligence (preferred) and Natural Language Processing (preferred)
  • Strong programming skills with Java (required) and Python (preferred)

Human-human behavior analysis

  • Course or project experience with human behavior analysis through transcription and annotation
  • Previous experience with research on child development and STEM education (preferred)

SARA/InMind Project

We are inventing a personal assistant of the future: SARA, the Socially-Aware Robot Assistant, which can build a relationship with a user and then employ that relationship to better achieve the user’s goals. SARA is an embodied intelligent personal assistant that analyzes the user’s visual (head and face movement), vocal (acoustic features), and verbal (conversational strategies) behaviors to estimate, in real time, the level of interpersonal closeness that the user feels toward the system. It then uses its own nonverbal (body movements synthesized on an intelligent agent), vocal (acoustic features), and verbal (language) behaviors to maintain or increase that interpersonal closeness, so that the user is willing to disclose information that allows the system to better serve the user’s goals. SARA was presented at the World Economic Forum 2017.
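As a rough, illustration-only sketch (not SARA’s actual architecture or code), the sense-estimate-respond loop described above might look like the following in Python; all feature names, weights, and strategy labels here are hypothetical placeholders.

    # Illustration only: estimate interpersonal closeness (rapport) from toy
    # multimodal features, then pick a conversational strategy in response.
    # The features, weights, and strategy names are invented for this sketch.
    from dataclasses import dataclass

    @dataclass
    class MultimodalFeatures:
        smile: float            # visual: fraction of frames with a detected smile
        pitch_variation: float  # vocal: normalized acoustic variation
        self_disclosure: bool   # verbal: did the user share personal information?

    def estimate_rapport(f: MultimodalFeatures) -> float:
        """Toy rapport estimate in [0, 1] using hand-picked placeholder weights."""
        score = 0.4 * f.smile + 0.3 * f.pitch_variation + (0.3 if f.self_disclosure else 0.0)
        return max(0.0, min(1.0, score))

    def choose_strategy(rapport: float) -> str:
        """Pick a (made-up) conversational strategy to maintain or raise rapport."""
        if rapport < 0.3:
            return "adhere_to_social_norms"      # stay polite and task-focused
        if rapport < 0.7:
            return "reciprocate_self_disclosure"
        return "praise_and_shared_reference"

    if __name__ == "__main__":
        obs = MultimodalFeatures(smile=0.6, pitch_variation=0.5, self_disclosure=True)
        r = estimate_rapport(obs)
        print(f"estimated rapport: {r:.2f} -> strategy: {choose_strategy(r)}")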

We are also implementing a virtual personal assistant that lives on your phone and gives you access to all that the web has to offer, while learning about you and your preferences, using that information to get you to what you want more quickly, and talking to you increasingly like an old friend rather than a brand-new business client. Indeed, Siri, Cortana, and even Alexa all act as if they have just met you: they never change the way they talk to you, or even how much they know about you. Our hypothesis is that virtual agents that build a relationship with users will generate loyalty to the brand AND be more effective at the task they’re performing.

For more information about these projects, see the InMind or SARA project description.

We are currently looking for interns who will work on the following aspects of the projects:

Data Collection for the Next Next Generation of Personal Assistant

In this internship, you’ll work in a team to collect longitudinal data from human-human interactions. The goal of this data collection is to understand how individuals’ interpersonal styles evolve over time, and how the social strategies expressed by the participants are related to what they know about each other. Along the way you’ll learn what it takes to set up a data collection, make sure that all the audio and video inputs are synchronized, analyze the data collected, and use those data to improve the system. During the data analysis, you’ll identify and annotate the different conversational strategies used by both participants, and investigate how participants use their model of the other to change the course of the interaction.

2017 ArticuLab: we had 21 summer interns, some of whom are shown here,
as well as our lab director, graduate students, postdocs and staff.

2016 ArticuLab: we had 26 summer interns, some of whom are shown here,
as well as our lab director, graduate students, postdocs and staff.

2015 ArticuLab
