Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

Pages

Posts

Future Blog Post

less than 1 minute read

Published:

This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.
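As a minimal sketch (assuming the standard Jekyll setup, where the site configuration lives in _config.yml at the repository root), the relevant setting looks like:

  # _config.yml
  future: false   # posts dated in the future are neither built nor listed

Future-dated posts can still be previewed locally by passing the --future flag to jekyll build or jekyll serve.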

Blog Post number 4

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 3

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 2

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 1

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

portfolio

publications

(Preprint) Towards Understanding Neural Collapse: The Effects of Batch Normalization and Weight Decay

Published in arXiv preprint, 2023

Neural Collapse (NC) is a geometric structure recently observed in the final layer of neural network classifiers. In this paper, we investigate the interrelationships between batch normalization (BN), weight decay, and proximity to the NC structure. Our work introduces the geometrically intuitive intra-class and inter-class cosine similarity measure, which encapsulates multiple core aspects of NC. Leveraging this measure, we establish theoretical guarantees for the emergence of NC under the influence of last-layer BN and weight decay, specifically in scenarios where the regularized cross-entropy loss is near-optimal. Experimental evidence substantiates our theoretical findings, revealing a pronounced occurrence of NC in models incorporating BN and appropriate weight-decay values. This combination of theoretical and empirical insights suggests that BN and weight decay play a strongly influential role in the emergence of NC.

Recommended citation: Leyan Pan and Xinyuan Cao. Towards understanding neural collapse: The effects of Batch Normalization and Weight Decay. arXiv preprint, 2023. https://arxiv.org/abs/2309.04644
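For intuition, here is a hedged sketch of what intra-class and inter-class cosine similarity measures of this kind typically look like; the notation (last-layer features h_i, per-class index sets I_c, C classes) is assumed for illustration and may differ from the paper's exact definitions:

\[
\mathrm{Intra}(c) = \frac{1}{|I_c|\,(|I_c|-1)} \sum_{\substack{i, j \in I_c \\ i \neq j}} \frac{\langle h_i, h_j \rangle}{\|h_i\|\,\|h_j\|},
\qquad
\mathrm{Inter}(c, c') = \frac{1}{|I_c|\,|I_{c'}|} \sum_{i \in I_c,\, j \in I_{c'}} \frac{\langle h_i, h_j \rangle}{\|h_i\|\,\|h_j\|}, \quad c \neq c'.
\]

Under the simplex equiangular tight frame geometry associated with Neural Collapse, the intra-class value tends to 1 for every class, while the inter-class value tends to -1/(C-1) for every pair of distinct classes.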

talks

teaching

Graduate Teaching Assistant for CS 4510 Automata and Complexity

Undergraduate course, Georgia Institute of Technology, School of Computer Science, 2022

Answered in-class questions, hosted office hours, and graded exams and homework for ~100 students in CS 4510 Automata and Complexity (Spring 2022, Fall 2022, Spring 2023), instructed by Prof. Zvi Galil at Georgia Tech.

Graduate Teaching Assistant for CS 6262 Network Security

Graduate course, Georgia Institute of Technology, School of Cybersecurity and Privacy, 2023

Answered online questions, hosted office hours, and graded homework for CS 6262 Network Security, instructed by Prof. Wenke Lee at Georgia Tech.