# Mastering the Tanh Activation Function: Definition, Graph, and Applications

## Introduction to tanh Activation Function

When it comes to creating deep learning neural networks, choosing the right activation function is essential. Activation functions determine the output of a neural network, allowing it to model complex and non-linear relationships between inputs and outputs.

In this article, we’ll dive into the details of the tanh (tangent hyperbolic) activation function, discussing its definition, equation, and how it compares to other activation functions like the sigmoid.

## Activation Functions: Definition and Types

Activation functions are mathematical equations that allow neural networks to model complex non-linear relationships between input values and output predictions.

When the input values are fed into the neural network, activation functions determine how the output is generated by applying some sort of transformation to the weighted sum of input values. There are two main types of activation functions: linear and non-linear.

Linear activation functions simply perform a linear transformation of the input data, producing output values proportional to the input values. However, linear activation functions are not well-suited for modeling non-linear relationships, as they cannot capture the complex interactions between inputs and outputs.

This is where non-linear activation functions like tanh come into play.

## Tanh Activation Function: Definition and Overview

The tanh activation function is a non-linear function that maps input values to output values between -1 and 1, making it well-suited for neural networks that require outputs within this range.

The tanh function is defined as follows:

tanh(x) = (e^x - e^-x) / (e^x + e^-x)

Here, e represents Euler's number, a mathematical constant approximately equal to 2.718. The tanh function is the hyperbolic tangent function from mathematics, where it appears in the study of hyperbolic geometry.
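As a quick sanity check, the definition above can be evaluated directly with NumPy and compared against the built-in `np.tanh`:

```python
import numpy as np

# Evaluate the definition (e^x - e^-x) / (e^x + e^-x) directly
# and compare it against NumPy's built-in implementation.
x = np.linspace(-5, 5, 11)
manual = (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
builtin = np.tanh(x)

print(np.allclose(manual, builtin))  # True
```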

## Tanh Activation Function vs Sigmoid Activation Function

The sigmoid activation function is another popular non-linear function used in neural networks. It maps input values to output values between 0 and 1, making it well-suited for models that require a probability-like output, such as binary classification problems.

The primary difference between the tanh and sigmoid functions is the range of output values they produce. While sigmoid produces values between 0 and 1, tanh produces values between -1 and 1.

This means that the tanh function is more effective for models that require a symmetric activation across both positive and negative values. In addition, the tanh function's derivative is steeper near 0 (its maximum is 1, versus 0.25 for the sigmoid), which produces larger gradients during gradient descent and backpropagation.
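The derivative comparison is easy to verify numerically. Using the standard closed forms tanh'(x) = 1 - tanh(x)^2 and sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh_grad(x):
    # Derivative of tanh: 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

def sigmoid_grad(x):
    # Derivative of sigmoid: s(x) * (1 - s(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# At x = 0, tanh's gradient is 4x larger than sigmoid's.
print(tanh_grad(0.0))     # 1.0
print(sigmoid_grad(0.0))  # 0.25
```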

Gradient descent is an optimization algorithm used to find the minimum of a function, while backpropagation is an algorithm for training deep neural networks.

## Conclusion

In conclusion, the tanh activation function is a powerful tool in the arsenal of deep learning practitioners. It’s a non-linear function that maps input values to output values between -1 and 1, making it well-suited for models requiring a symmetric activation function with outputs across both positive and negative values.

While the sigmoid function remains a popular alternative, the tanh function's steeper gradients near zero and wider, symmetric output range make it a preferred option for many deep learning applications. With this information, neural network developers can make informed decisions about which activation function to use when building deep learning models.

## Graph of tanh Activation Function

Visualizing activation functions such as the tanh function is an essential part of understanding how they work. In this section, we discuss how to create a graph of the tanh activation function using Matplotlib and explore its zero-centered output.

We also discuss the relationship between the tanh and sigmoid activation functions and how it can be used to plot the graph of the tanh function.

## Creating a Tanh Graph Using Matplotlib

Matplotlib is a popular data visualization library in Python that allows us to create graphs of mathematical functions like the tanh function. To create a graph of the tanh function, we first need to import the Matplotlib library and the numpy module, which provides support for mathematical operations in Python.

We then create a range of values for x, create an array of corresponding y values using the tanh function, and plot the graph using the Matplotlib plot() function. Here’s the Python code for creating a graph of the tanh function between -10 and 10:

```python
import numpy as np
import matplotlib.pyplot as plt

# Plot tanh over the interval [-10, 10]
x = np.linspace(-10, 10, 1000)
y = np.tanh(x)

plt.plot(x, y)
plt.title('Graph of tanh Function')
plt.xlabel('x')
plt.ylabel('y')
plt.grid()
plt.show()
```

This will create a graph of the tanh function that looks like the following:

![Tanh Graph Image](https://github.com/aimakerspace/python-for-deep-learning-course/blob/main/assets/tanh_graph.png)

## Output of Tanh Function

The output of the tanh function ranges from -1 to 1 and is zero-centered: activations are distributed symmetrically around zero, which helps keep gradients well-behaved when optimizing deep learning models with gradient descent and backpropagation algorithms.

This zero-centering comes directly from the shape of the function itself: tanh is an odd function, meaning tanh(-x) = -tanh(x), so it produces negative values for inputs below zero and positive values for inputs above zero, with no extra mean subtraction required.
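The symmetric, zero-centered behavior can be checked directly in NumPy:

```python
import numpy as np

# tanh is odd: tanh(-x) == -tanh(x), so its outputs straddle zero symmetrically.
x = np.linspace(-10, 10, 1001)
y = np.tanh(x)

print(np.allclose(np.tanh(-x), -y))  # True: odd symmetry
print(abs(y.mean()) < 1e-9)          # True: symmetric inputs give ~zero mean output
```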

## Using the Relationship Between Tanh and Sigmoid Activation Functions to Plot the Graph

The tanh activation function is closely related to the sigmoid activation function, as we mentioned earlier. In fact, the tanh function can be derived from the sigmoid function using the following formula:

tanh(x) = 2 * sigmoid(2x) – 1
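This identity can be verified numerically on a grid of points:

```python
import numpy as np

# Check the identity tanh(x) = 2 * sigmoid(2x) - 1.
x = np.linspace(-10, 10, 1000)
sigmoid_2x = 1 / (1 + np.exp(-2 * x))  # sigmoid evaluated at 2x

print(np.allclose(np.tanh(x), 2 * sigmoid_2x - 1))  # True
```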

This relationship can be used to plot the graph of the tanh function using the graph of the sigmoid function as a template.

By rescaling the sigmoid function's input and output, we can reproduce the tanh function's graph exactly. Here's the Python code for creating a graph of the tanh function from the sigmoid function:

```python
import numpy as np
import matplotlib.pyplot as plt

# Build tanh from the sigmoid using the identity tanh(x) = 2 * sigmoid(2x) - 1.
x = np.linspace(-10, 10, 1000)
y_sigmoid = 1 / (1 + np.exp(-2 * x))  # sigmoid evaluated at 2x
y_tanh = 2 * y_sigmoid - 1

plt.plot(x, y_tanh)
plt.title('Graph of tanh Function')
plt.xlabel('x')
plt.ylabel('y')
plt.grid()
plt.show()
```

This will create a graph of the tanh function that looks like the following:

![Tanh-Sigmoid Graph Image](https://github.com/aimakerspace/python-for-deep-learning-course/blob/main/assets/tanh_sigmoid_graph.png)

## Applications of Tanh Activation Function

The tanh activation function is used in a variety of deep learning applications, including natural language processing (NLP) and speech recognition. In NLP, the tanh activation function is often used in recurrent neural networks (RNNs), which are specialized models used to process sequence data, like text and speech.

One example of an NLP model using tanh activation functions is the Long Short-Term Memory (LSTM) network. LSTMs are a type of RNN that use gating mechanisms (input, forget, and output gates) to selectively retain or discard information from previous states.

The tanh function is used inside the LSTM cell to squash the candidate cell state and the cell output to values between -1 and 1, keeping activations within a manageable range. Another example where the tanh activation function can be used is in speech recognition, in combination with other deep learning models like convolutional neural networks (CNNs).

CNNs are used to extract features from raw audio, which are then fed into an RNN with tanh activation functions to generate a transcription of the spoken words.
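To make the LSTM's use of tanh concrete, here is a minimal, untrained sketch of a single LSTM time step in NumPy (randomly initialized parameters, for illustration only). Note the two places tanh appears: bounding the candidate cell state, and bounding the cell state before the output gate is applied:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b hold the stacked parameters for the
    input (i), forget (f), output (o) gates and the candidate state (g)."""
    z = W @ x + U @ h_prev + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)                 # tanh bounds the candidate state to (-1, 1)
    c = f * c_prev + i * g         # new cell state
    h = o * np.tanh(c)             # tanh also bounds the cell output
    return h, c

# Random (untrained) parameters for a cell with 3 inputs and 4 hidden units
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(size=(4 * n_hid, n_in))
U = rng.normal(size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h, c = lstm_step(rng.normal(size=n_in), np.zeros(n_hid), np.zeros(n_hid), W, U, b)
print(h.shape, np.abs(h).max() < 1.0)  # hidden state stays bounded by tanh
```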

## Conclusion

The tanh activation function is an important tool in deep learning models; its zero-centered output and its range of values between -1 and 1 make it an ideal function for many neural network applications. Visualizing the function can be achieved using Matplotlib and NumPy, and its close relation to the sigmoid makes plotting the function quite simple.

Finally, we learned about some of the applications where tanh activation functions are used, particularly in NLP and speech recognition models using RNNs and CNNs. All of this information can help deep learning practitioners fine-tune their models for better performance across a range of different applications.

## Summary

Neural networks are an essential tool in deep learning, and activation functions like the tanh function are critical to their operation. In this article, we discussed the definition and types of activation functions, focusing on the non-linear tanh function and its characteristics.

We also explored how to create a graph of the tanh function using Matplotlib, its zero-centered output, and its relation to the sigmoid function. Lastly, we discussed several applications of the tanh function, especially in natural language processing and speech recognition using RNNs and CNNs.

## Recap of Main Topics and Subtopics

1. Introduction to tanh Activation Function
   - Definition of activation functions
   - Non-linear vs linear activation functions
   - Introduction to the tanh activation function
2. Equation and Relation with Sigmoid Activation Function
   - Equation of tanh activation function
   - Relation between tanh and sigmoid activation functions
3. Graph of tanh Activation Function
   - Creating a tanh graph using Matplotlib
   - Output of tanh function
   - Using the relation between tanh and sigmoid activation functions to plot the graph
4. Applications of tanh Activation Function
   - Use of tanh in NLP applications

In the introduction, we provided an overview of activation functions, explaining their purpose and highlighting the difference between linear and non-linear functions. Then, we focused on the tanh activation function, providing an equation for it and discussing its relationship with the sigmoid function.

We explained how the tanh function produces zero-centered output values between -1 and 1, and how neural network developers can use this function to optimize their models using gradient descent and backpropagation algorithms. In the next section, we talked about how to create a graph of the tanh function using Matplotlib, a popular data visualization library in Python.

We also explored the details of the function’s output, specifically its zero-centered nature and its relation to the sigmoid function. By highlighting the similarities between the two functions, we showed how the sigmoid function can be used to create the tanh function’s graph, providing real-life examples of how developers can use this information to visualize activation functions and improve their deep learning models.

Finally, we discussed several applications of the tanh activation function, particularly in natural language processing and speech recognition using RNNs and CNNs. We gave examples of models like the LSTM and illustrated how these architectures harness tanh activation functions to achieve strong results in language processing and speech recognition. This article has provided deep learning practitioners with an overview of the tanh activation function, its graph, and the underlying math behind it.

Along with practical applications of these concepts, it can help in understanding the role of activation functions in building deep neural networks. In conclusion, the tanh activation function is a vital tool in the deep learning toolbox, offering a zero-centered output and a range of -1 to 1 that makes it exceptionally well-suited for neural network models.

This article has explored the definition and types of activation functions and focused on non-linear functions like the tanh. We’ve explained how to create a graphical representation of the function using Matplotlib, its relevance in deep learning optimization, and its relationship with the sigmoid function.

Furthermore, we have touched upon its use in natural language processing and speech recognition using RNNs and CNNs, revealing its practical applications. The takeaway is that in-depth understanding of the tanh function is essential to optimizing and improving deep neural network models, and we hope that the information provided has been useful to our readers.