Consensus on SO(3)

I wrote simulations illustrating how decentralized algorithms can fail to bring a group of objects into a common orientation. In mathspeak, I illustrated topological obstructions to consensus on SO(3). No special math knowledge is needed for the video below.
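One way to see the kind of obstruction involved, without any simulation machinery: SO(3) is doubly covered by the unit quaternions, so `q` and `-q` describe the same rotation, and a naive coordinate-averaging consensus step can collapse to something that is not a rotation at all. A minimal sketch (this toy example is mine, not taken from the simulations above):

```python
import numpy as np

# Orientations as unit quaternions. Because of SO(3)'s double cover,
# q and -q encode the SAME rotation, so a naive "average the
# coordinates" consensus update can fail badly.

def naive_average(quaternions):
    """Coordinate-wise mean of quaternions (ignores the double cover)."""
    return np.mean(quaternions, axis=0)

q = np.array([0.0, 1.0, 0.0, 0.0])   # 180-degree rotation about the x-axis
agents = [q, -q]                     # identical orientations, opposite signs

avg = naive_average(agents)
print(np.linalg.norm(avg))           # the mean is the zero vector: it cannot
                                     # be renormalized back to a rotation
```

Real consensus schemes have to work around this, e.g. by sign-aligning quaternions locally before averaging, and the topology of SO(3) is what makes a globally consistent fix impossible in general.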

Stochastic Regularizers

Stochastic regularizers are tools used when training neural networks to prevent overfitting. Two of the most common are batch normalization and dropout. Some of the explanations for how these tools work are just-so stories: dropout is supposed to fight “complex coadaptations”, which are hypothesized to be a distinct flavor of overfitting. (Interestingly, this has connections to biology; see here for a Twitter thread I wrote summarizing this.)

I wrote a report surveying the use of these regularizers in neural networks, and the state of research (as of late 2018) on how they work. I did numerical experiments in which I took neural networks trained with these methods and then applied dropout __at test time__, which is unusual, to see if it could be used to quantify “complex coadaptations”. Report (pdf)
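The test-time idea can be sketched in a few lines: keep the dropout mask active during inference and look at the spread of predictions across stochastic forward passes. Everything below (the tiny network, weights, and names) is illustrative, not the report's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy two-layer network with inverted dropout left ON at inference.
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 1))

def forward(x, p_drop=0.5):
    h = np.maximum(x @ W1, 0.0)            # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop    # fresh dropout mask at *test* time
    h = h * mask / (1.0 - p_drop)          # inverted-dropout scaling
    return h @ W2

x = rng.normal(size=(1, 4))
preds = np.array([forward(x) for _ in range(200)])

# The spread across passes measures how sensitive the prediction is
# to knocking out random hidden units.
print(preds.mean(), preds.std())
```

A network whose units had formed brittle coadaptations would, on this view, show a larger spread under test-time masking than one trained to be robust to it.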

SQL and Music Theory

I made a prototype of a database for the harmonic content of pop songs. It uses a web scraper to extract data from guitar-tablature websites, guesses each song's key when it can, and catalogues the song's harmonic information in a SQL database with a snowflake schema.
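A snowflake schema here means a central fact table of chord events whose dimensions are themselves normalized into sub-tables (songs point to artists, chords point to root pitch classes). A minimal sketch in SQLite via Python; all table and column names are hypothetical, not the prototype's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE artist (artist_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE song   (song_id INTEGER PRIMARY KEY, title TEXT, key TEXT,
                     artist_id INTEGER REFERENCES artist(artist_id));
CREATE TABLE pitch  (pitch_id INTEGER PRIMARY KEY, pitch_class TEXT);
CREATE TABLE chord  (chord_id INTEGER PRIMARY KEY, quality TEXT,
                     root_id INTEGER REFERENCES pitch(pitch_id));
-- The fact table: one row per chord occurrence in a song.
CREATE TABLE chord_event (event_id INTEGER PRIMARY KEY,
                          song_id  INTEGER REFERENCES song(song_id),
                          chord_id INTEGER REFERENCES chord(chord_id),
                          position INTEGER);
""")

conn.execute("INSERT INTO artist VALUES (1, 'Example Artist')")
conn.execute("INSERT INTO song VALUES (1, 'Example Song', 'C major', 1)")
conn.execute("INSERT INTO pitch VALUES (1, 'C')")
conn.execute("INSERT INTO chord VALUES (1, 'maj', 1)")
conn.execute("INSERT INTO chord_event VALUES (1, 1, 1, 0)")

# Reassembling a chord symbol walks the snowflake: event -> chord -> pitch.
row = conn.execute("""
    SELECT s.title, p.pitch_class || c.quality
    FROM chord_event e
    JOIN song  s ON s.song_id  = e.song_id
    JOIN chord c ON c.chord_id = e.chord_id
    JOIN pitch p ON p.pitch_id = c.root_id
    ORDER BY e.position
""").fetchone()
print(row)   # ('Example Song', 'Cmaj')
```

The normalization pays off for harmonic queries: transposition-invariant questions ("how often does a major chord on scale degree 4 follow one on degree 5?") become joins against the pitch and chord dimensions rather than string parsing.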