Excited to be sharing another blog post, this time diving into the topic of regularisation in neural networks! Some popular regularisation techniques can initially seem quite bizarre, but after reading about their central importance in deep learning I wanted to dig deeper for myself. I give both experimental and theoretical justifications for some common regularisation techniques, which I hope makes the topic both interesting and accessible. Excited to hear your thoughts! https://lnkd.in/d-6prCCJ
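One regularisation technique that often seems bizarre at first is dropout. As a minimal NumPy sketch (the function, keep probability, and array sizes here are illustrative, not taken from the blog post): inverted dropout zeroes each activation with probability 1 - p during training and rescales the survivors by 1/p, so the expected activation matches the dropout-free behaviour at inference time.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p=0.8, training=True):
    """Inverted dropout: keep each activation with probability p, scaled by 1/p."""
    if not training:
        return activations  # no-op at inference time
    mask = (rng.uniform(size=activations.shape) < p) / p
    return activations * mask

a = np.ones((1000, 10))
out = dropout(a, p=0.8)
print(out.mean())  # close to 1.0 in expectation, despite 20% of units being zeroed
```

The 1/p rescaling is what makes this the "inverted" variant: it removes the need to rescale anything at test time.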
Thomas Kite, PhD’s Post
-
Bridging the gap between Deep Learning and explainable algorithms. Deep neural networks learn fragile "shortcut" features, rendering them difficult to interpret (black box) and vulnerable to adversarial attacks. This paper proposes semantic features as a general architectural solution to this problem. The main idea is to make features locality-sensitive in an appropriate semantic topology of the domain, thus introducing a strong regularization. The proof-of-concept network is lightweight, inherently interpretable, and achieves almost human-level adversarial test metrics - with no adversarial training! Can't wait to hear your feedback!
A Conceptual Framework For White Box Neural Networks
arxiv.org
-
Many researchers applying AI to wireless communication focus on results and comparisons with other networks or classical methods. In industry, a lot of time has been spent understanding the limitations and robustness of classical methods, and optimizing them from a complexity perspective. I believe this is just as important when people replace classical algorithms/blocks with NNs, and researchers in academia should also consider this aspect. There is a whole theory of generalization in deep learning, and this paper can be a good starting point:
arXiv:1710.05468
arxiv.org
-
4 GitHub Repos for AI/ML 1. Neural Networks: introductions to Neural Networks, from Andrej Karpathy. https://lnkd.in/erA-Gerp 2. Machine Learning Specialization - Andrew Ng’s lectures & practice problems https://lnkd.in/ebtWemWD 3. Deep Learning Specialization - Andrew Ng’s lectures & practice problems https://lnkd.in/e2RECCSQ 4. minGPT - Introduction to transformers, from Andrej Karpathy. https://lnkd.in/e_NsHVRn
GitHub - karpathy/nn-zero-to-hero: Neural Networks: Zero to Hero
github.com
-
Two months ago, I only had a vague idea of how neural networks were constructed and used. After this course, I've been able to create a full transformer model, and I am super proud of this achievement, which required hours of work watching videos and practicing in hands-on labs. If, like me, you strive to learn and stay curious, I can only recommend this course for better understanding the world of deep learning.
Completion Certificate for Deep Learning
coursera.org
-
Week #5 Deep Learning Basics II: Backpropagation and Gradient Descent. Backpropagation and gradient descent are essential for training neural networks. Backpropagation calculates each weight's contribution to the error, while gradient descent adjusts those weights to minimize the loss function. Through iterative weight updates, the network becomes more accurate over time. Here's a simple implementation of gradient descent for linear regression, showing how weight adjustments reduce prediction errors. Mastering these concepts is key to building efficient neural networks.
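The implementation the post refers to did not come through in the feed; a minimal NumPy sketch of gradient descent for linear regression (the data, learning rate, and iteration count below are illustrative assumptions) could look like this:

```python
import numpy as np

# Hypothetical toy data: y = 3x + 2 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 2.0 + rng.normal(0, 0.1, size=100)

w, b = 0.0, 0.0   # initial weight and bias
lr = 0.1          # learning rate

for _ in range(500):
    y_pred = w * x + b
    error = y_pred - y
    # Gradients of the mean-squared-error loss L = mean(error^2)
    grad_w = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    # Gradient descent step: move against the gradient
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # w and b approach the true values 3.0 and 2.0
```

Each iteration nudges the parameters in the direction that most decreases the loss, which is exactly the mechanism backpropagation scales up to deep networks.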
-
After a long battle, I can finally say it - I have completed the Deep Learning course made by DeepLearning.AI. 😁 Looking back, I can't stress enough how exciting this course is, giving a pretty decent view into Neural Networks, Computer Vision, NLP, and even foundations for building my own LLM 😅 Thank you very much for this opportunity, and I hope to put my knowledge to use with some ML project in the future 🚀
Completion Certificate for Deep Learning
coursera.org
-
Aside from reviewing how to create CNNs and RNNs (Convolutional and Recurrent Neural Networks, respectively) using PyTorch, I think another big takeaway from this course is the introduction (discussed in the last chapter) to creating multi-input and multi-output models. Although I knew that one can create such models, this was my first formal introduction to the topic. Overall, this is another great course from DataCamp, especially for those who are really interested in doing deep learning with PyTorch. I admit that I still have to master everything taught in the course, and one way is to redo all the exercises. (I only lately realized that one can download the datasets used in the course from the course page itself. My bad.) #deeplearning #pytorch
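A multi-input, multi-output model in PyTorch boils down to a `forward` that accepts several tensors and returns several tensors. This is a hedged sketch, not the course's own example; the branch names, layer sizes, and the image/tabular framing are all illustrative assumptions:

```python
import torch
import torch.nn as nn

class TwoHeadNet(nn.Module):
    """Two inputs (e.g. an image embedding and tabular features),
    two outputs (class logits and a regression value)."""
    def __init__(self):
        super().__init__()
        self.image_branch = nn.Sequential(nn.Linear(64, 32), nn.ReLU())
        self.tabular_branch = nn.Sequential(nn.Linear(8, 32), nn.ReLU())
        self.class_head = nn.Linear(64, 3)   # classification output
        self.value_head = nn.Linear(64, 1)   # regression output

    def forward(self, image, tabular):
        # Encode each input separately, then fuse by concatenation
        h = torch.cat([self.image_branch(image), self.tabular_branch(tabular)], dim=1)
        return self.class_head(h), self.value_head(h)

model = TwoHeadNet()
logits, value = model(torch.randn(4, 64), torch.randn(4, 8))
print(logits.shape, value.shape)  # torch.Size([4, 3]) torch.Size([4, 1])
```

Training such a model typically sums a per-head loss (e.g. cross-entropy on `logits` plus MSE on `value`) before the single backward pass.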
Adrian Josele Quional's Statement of Accomplishment | DataCamp
datacamp.com
-
I have always been passionate about deep learning. Creating neural networks that learn and improve over time is like raising a newborn. But sometimes I wonder: what is really happening inside? Yes, we describe how it works in broad terms, but nothing more. We use it and that's it! Of course you don't have to reinvent the wheel, but sometimes it is essential to understand the internal mechanism to take better advantage of the technology offered to you. Today, I found my happiness in this book, and I hope it will be the same for you. Happy reading!
-
Dive into the fascinating world of neural networks and deep learning with our latest blog post. Whether you're a novice or a seasoned pro, there's something for everyone in this comprehensive guide. Discover how these powerful algorithms are transforming technology and shaping our future.
Neural Networks: Basics of Deep Learning
nolojik.com
-
Hi, I wrote an article about TensorFlow: A Deep Learning Library. Check it out, and thanks 😊 #TensorFlow #DeepLearning #AI
TensorFlow : A Deep Learning Library
python.plainenglish.io
Data Scientist at Volcore
Awesome thumbnail 👌🏼 Excited to read it!