
About Myself
I'm Noah Schliesman, an EE student at the University of San Diego working with Dr. Kramer on using Kalman Filtering to measure uncertainty and denoise GPS signals, with the eventual goal of applying Kalman's extension of state-space theory to neural networks. I'm targeting a PhD candidacy starting Fall 2025 in Neural Networks, Natural Language Processing, and Transformers. Below is some of my current work, in which I explore key concepts graphically to build and share intuition for the field. Please reach out if you're interested or would like to talk.

Original Work and Primers
Sinusoidal Weights for Fourier Expressibility
Hilbert Spaces
Binary Classification in Supervised Learning
RNNs, GRUs, and LSTMs
Attention and Transformer Architectures
DeepSeek-R1

Annotated Papers
Perceptron
Kalman Filter
Backpropagation
Backpropagation via Lagrangians
Optimal Brain Damage
Bidirectional RNNs
Vanishing Gradient
Support Vector Networks
Greedy Layerwise Training
Autoencoders
Curriculum Learning
ImageNet Classification with CNNs
Word2Vec
Seq2Seq Translation via Additive Attention
Neural GPU
Transformers
m-grams

Original Animations
Single Neuron Activation