Hands-on NumPy (IV): Universal Functions and Array-oriented Programming
Universal functions (ufuncs) are special NumPy functions that operate on ndarrays element by element.
They represent a vast array of vectorized functions that perform much better than iterative implementations and let you write concise code. Most ufuncs achieve this by providing a Python wrapper around a C implementation.
In this ...
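As a quick illustration of the element-wise behavior described above (my own minimal sketch, with made-up values, not taken from the article):

```python
import numpy as np

arr = np.arange(5)  # [0 1 2 3 4]

# np.sqrt is a unary ufunc: it is applied to each element in turn
roots = np.sqrt(arr)

# np.maximum is a binary ufunc: it compares two arrays element by element
peaks = np.maximum(arr, arr[::-1])

print(roots)
print(peaks)  # [4 3 2 3 4]
```

Both calls replace an explicit Python loop with a single vectorized operation, which is where the performance gain comes from.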
Hands-on NumPy (III): Indexing and slicing
NumPy array indexing is a big topic, and there are many different ways of selecting elements from an array.
Let’s start with the simplest case: selecting an entry from a 1-dimensional array.
import numpy as np
arr = np.arange(10)
print(arr)
[0 1 2 3 4 5 6 7 8 9]
You can access elements from a 1-dimensional array in NumPy using the same sy...
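Continuing the snippet above, a small sketch of that list-like syntax (my own example; the slice-view detail is standard NumPy behavior, not a claim from the article):

```python
import numpy as np

arr = np.arange(10)

print(arr[3])    # 3  -- same syntax as a Python list
print(arr[-1])   # 9  -- negative indices count from the end
print(arr[2:5])  # [2 3 4] -- note: slices return views, not copies
```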
Hands-on NumPy (II): Performing basic operations with ndarrays
In the last article, we learned many different ways in which we can create ndarrays. Now that we know how to create NumPy arrays, it’s time to start playing around with them.
We will learn to perform basic operations, a task you can divide into 3 categories:
Operations between arrays of the same size.
Operations between an array and a scal...
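A minimal sketch of the first two categories (illustrative values of my own, not from the article):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([10.0, 20.0, 30.0])

# Operations between arrays of the same size act element-wise
print(a + b)  # [11. 22. 33.]
print(a * b)  # [10. 40. 90.]

# Operations between an array and a scalar apply the scalar to every element
print(a * 2)  # [2. 4. 6.]
```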
Hands-on NumPy (I): Creating ndarrays
NumPy (an acronym for Numerical Python) is a library for working with multi-dimensional arrays and matrices. It was created in 2005 by Travis Oliphant, and has since received numerous contributions from the community that enabled it to grow into one of the most used tools in data science.
NumPy lets you manipulate huge arrays in a very performan...
Deep Learning Basics(11): Moving forward
We reached the end of our introductory journey in deep learning.
Now you understand what this is all about. Maybe you really like it and are ready to deepen your knowledge of the topic (deepen, deep learning, get it? 👀).
This will be a shorter article; I’ll just offer some pointers you can follow as next steps. Alright, let’s get started!
...
Deep Learning Basics(10): Regularization
In the previous article we learned how to use Keras to build more powerful neural networks. Professional-grade libraries like Keras, Tensorflow, and Pytorch let you build neural networks that can learn intricate patterns and solve novel problems.
Deep-learning networks can learn subtle patterns thanks to their inherently large hypothesis space...
Deep Learning Basics(9): Building networks using Keras
We already covered the most important deep learning concepts and created different implementations using vanilla Python. Now, we are in a position where we can start building something a bit more elaborate.
We’ll use a more hands-on approach by building a deep learning model for classification using production-grade software.
You will learn ho...
Deep Learning Basics(8): Intermediate layers and backpropagation
In the previous article we learned that neural networks look for the correlation between the inputs and the outputs of a training set. We also learned that based on the pattern, the weights will have an overall tendency to increase or decrease until the network predicts all the values correctly.
Sometimes there is not a clear direction in this ...
Deep Learning Basics(7): Correlation
In previous articles, we learned how neural networks adjust their weights to improve the accuracy of their predictions using techniques like gradient descent.
In this article, we will take a look at the learning process using a more abstract perspective. We will discuss the correlation between inputs and outputs in a training set, and how neura...
Deep Learning Basics(6): Generalized gradient descent (II)
In the previous article, we laid the foundations for a generalized implementation of gradient descent. Namely, cases with multiple inputs and one output, and multiple outputs and one input.
In this article, we will continue our generalization efforts to come up with a version of gradient descent that works with any number of inputs and outputs.
First, ...
Deep Learning Basics(5): Generalized gradient descent (I)
In the previous article, we learned about gradient descent with a simple 1-input/1-output network. In this article, we will learn how to generalize this technique for networks with any number of inputs and outputs.
We will concentrate on 3 different scenarios:
Gradient descent on NNs with multiple inputs and a single output.
Gradient...
Quick tips: Integrating Google Analytics with Rails 6/5 + Turbolinks + Webpacker
Just including the script tags Google Analytics provides you is not enough to enable analytics when a user is navigating your app with Turbolinks.
We need to enable our app to send analytics to Google every time Turbolinks loads, and I wanted to share the solution I came up with for this problem. Before we start, let me share a disclaimer:
...
Deep Learning Basics(4): Gradient Descent
In the previous article, we learned about hot/cold learning.
We also learned that hot/cold learning has some problems: it’s slow and prone to overshoot, so we need a better way of adjusting the weights.
A better approach should take into consideration how accurate our predictions are and adjust the weights accordingly. Predictions that are way...
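The idea sketched above, scaling each weight adjustment by the prediction error, can be illustrated with a toy 1-input/1-output loop (all values here are made up for illustration):

```python
# Toy gradient descent: one input, one weight, one output
input_value, goal = 2.0, 0.8
weight = 0.5
alpha = 0.1  # learning rate

for _ in range(20):
    prediction = input_value * weight
    delta = prediction - goal                # how far off we are
    weight -= alpha * (delta * input_value)  # bigger error -> bigger correction

print(round(weight, 3))  # converges toward 0.4, since 2.0 * 0.4 == 0.8
```

Unlike hot/cold learning, the step size here shrinks automatically as the prediction gets closer to the goal.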
Deep Learning Basics(3): Hot/Cold learning
In the previous articles, we learned how neural networks perform estimations: a weighted sum is performed between the network inputs and its weights. Until now, the values of those weights were given to us by a mysterious external force. We took for granted that those are the values that produce the best estimates.
Finding the right value for e...
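The weighted sum mentioned above can be sketched in a few lines (the inputs and weights below are hypothetical values I chose for illustration):

```python
import numpy as np

inputs = np.array([0.5, 1.0, 0.2])   # hypothetical input values
weights = np.array([0.1, 0.2, 0.0])  # hypothetical weights ("knowledge")

# The network's estimate is the weighted sum (dot product) of inputs and weights
prediction = np.dot(inputs, weights)
print(prediction)  # roughly 0.25
```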
Deep Learning Basics(2): Estimation
In the previous article we learned what a neural network is and how it performs predictions: the input is combined with knowledge (in the form of a weight value) to produce an output.
In practice, just one input and one weight are rarely of any use. Most systems in the real world are much more complex, so you will need networks that can handle ...
Deep Learning Basics(1): Introduction
So, deep learning. Have you heard about it? If you work in the tech sector, you probably have. Every week you see news about how people are using it to solve all sorts of interesting challenges.
Because of all the interest (and sometimes raw hype) around deep learning, you might believe that it’s some sort of revolutionary new technology. Well,...
Some final notes I couldn't fit in the other Pragmatic Thinking articles
This week’s article will be a bit shorter than usual. In the last month we discussed what I consider to be the most important topics in Pragmatic Thinking and Learning, by Andy Hunt.
We discussed things like the Dreyfus Model, L-mode/R-mode brain operation modes, better ways of learning and common biases that affect our judgement. In this artic...
Those little bugs in our brains
We tend to think of our brains as infallible logical machines with perfect memory and absolute rationality. We couldn’t be more wrong: it doesn’t matter how educated or intelligent we are, there are inherent flaws in our brains we just can’t get rid of.
This has nothing to do with your preparation or ability for rational thinking; usually, it’s...