A Common Misconception About Scaling Neural Network Inputs
Alex-AI
Sep 26, 2021
644 views