Here are some of my projects and publications. You can find more coding projects on my GitHub profile.
We use reinforcement learning to learn grasping from RGB images on real robots. The robot learns emergent manipulation behaviors such as retrying failed grasps and object singulation.
Careful camera & workspace calibration is not always possible when deploying robots in the real world. In this work we use a recurrent policy to adapt to unknown optical distortions in a robotic reaching task.
We developed a simulated benchmark for robotic bin picking from RGB images and evaluated six off-policy learning algorithms on it.
We developed a highly data-efficient, unsupervised feature representation for robotic imitation learning. Using it as a state representation, a robot can imitate a human pouring liquids from a single video demonstration.
We taught a fleet of robotic arms to 1) detect, 2) classify, and 3) pick up a variety of objects in a nearly autonomous setup.
A method for backpropagation through discrete categorical samples in neural networks.
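The core idea can be sketched with a Gumbel-Softmax-style relaxation: perturb the logits with Gumbel noise, then take a temperature-controlled softmax so the "sample" stays differentiable. This is a minimal NumPy illustration, not the paper's implementation; the function name and temperature value are my own.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax_sample(logits, tau=0.5):
    """Differentiable, approximately one-hot sample from a categorical
    distribution (Gumbel-Softmax-style sketch)."""
    # Gumbel(0, 1) noise: -log(-log(U)) for U ~ Uniform(0, 1)
    u = rng.uniform(size=logits.shape)
    g = -np.log(-np.log(u))
    # Softmax of perturbed logits; as tau -> 0 this approaches a hard one-hot
    y = np.exp((logits + g) / tau)
    return y / y.sum()

logits = np.array([1.0, 2.0, 0.5])
sample = gumbel_softmax_sample(logits)
```

Because the sample is a smooth function of the logits, gradients can flow through it during backpropagation, unlike a hard `argmax`.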
A biophysical spiking neural network model of collision detection in Xenopus tadpoles.
Realtime graphics artwork on ShaderToy.
Visual Debugger for Deep Learning, built on TensorFlow.
Final project for my Chaotic Dynamics class (APMA136). I extended a paper to analyze the geometric structure of the submanifold dynamics of Echo State Networks.
Final project for my Neural Dynamics class (APMA2821V). I extended an existing biophysical cortex model to self-organize functional orientation maps and ocular dominance, alongside color orientation maps.
A high-frequency trading bot that exploits price inefficiencies between Bitcoin/altcoin markets for riskless profit. It performs pairwise and triangular arbitrage.
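The triangular-arbitrage check reduces to a simple test: a cycle of trades A → B → C → A is profitable when the product of the three exchange rates, discounted by per-trade fees, exceeds 1. This is an illustrative sketch with hypothetical rates and fee, not the bot's actual logic.

```python
def triangular_cycle_return(rate_ab, rate_bc, rate_ca, fee=0.001):
    """Net multiplicative return of trading A -> B -> C -> A,
    applying a proportional fee on each of the three trades."""
    gross = rate_ab * rate_bc * rate_ca
    return gross * (1.0 - fee) ** 3

# Hypothetical rates for a BTC -> ALT -> ETH -> BTC cycle (illustrative only)
net = triangular_cycle_return(400.0, 0.00013, 20.0)
profitable = net > 1.0
```

In practice the hard part is executing all three legs before the prices move, which is why the bot operates at high frequency.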
Three team members and I implemented Disney's Material Point Method snow simulation paper from scratch and extended it to run on the GPU.
CPU and CUDA implementations that compute the SVD of a 3x3 matrix in 2 microseconds.
Final project for the Pixar Undergraduate Program. I wrote a RenderMan DSO in C++, RIB, Python, and RSL that lets a user direct vegetation from the rendered perspective. Grass geometry is driven by Bézier curves and fractal Brownian motion.
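Fractal Brownian motion, as used for the grass above, is just a sum of noise octaves at increasing frequency and decreasing amplitude. The project itself used RSL; this is a minimal, illustrative Python sketch with a cheap hash-based value noise standing in for a proper noise function.

```python
import math

def value_noise(x):
    """Cheap deterministic 1D value noise over the integer lattice."""
    x0 = math.floor(x)
    t = x - x0
    def h(n):
        # integer hash mapped into [0, 1)
        return ((n * 2654435761) % 2**32) / 2**32
    s = t * t * (3 - 2 * t)  # smoothstep interpolation
    return h(x0) * (1 - s) + h(x0 + 1) * s

def fbm(x, octaves=4, lacunarity=2.0, gain=0.5):
    """Fractal Brownian motion: sum octaves of noise, doubling the
    frequency and halving the amplitude at each octave."""
    amp, freq, total = 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq)
        freq *= lacunarity
        amp *= gain
    return total
```

Sampling `fbm` along a blade's parameter gives the natural, turbulent bend that uniform noise lacks.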
A classmate and I investigated how the spatial arrangement of neurons changes the polychronization dynamics of spiking neural networks.
I wrote a spiking neural network simulator from scratch in Python and used it to model the dopaminergic-thalamic circuit. I coupled the neural simulation to a simulated robot created in Blender.