
Tyler Chang
Graduate student @ University of California San Diego
KEYWORDS
transformer, multilingual language models, language modeling, multilingual NLP, self-attention, agent, language acquisition, abstraction, representation, human-robot interaction, convolution, geometry, linguistic structure, out-of-domain generalization, word learning
5 presentations · 3 views
SHORT BIO
Tyler Chang is a cognitive science PhD student at UCSD, affiliated with the Halıcıoğlu Data Science Institute. He is interested in how people, machines, and artificial agents learn, comprehend, and produce language. His recent work has focused on the analysis of large language models, particularly during pre-training.
Presentations

When Is Multilinguality a Curse? Language Modeling for 250 High- and Low-Resource Languages
Tyler Chang and 3 other authors

Structural Priming Demonstrates Abstract Grammatical Representations in Multilingual Language Models
James Michaelov and 3 other authors

Characterizing and Measuring Linguistic Dataset Drift
Tyler Chang and 6 other authors

The Geometry of Multilingual Language Model Representations
Tyler Chang and 2 other authors

Word Acquisition in Neural Language Models
Tyler Chang and 1 other author