# Python AVL Trees: Efficient Data Organization and Search Operations

In computer science and data structures, trees are a common type of data structure used to organize and store data in a hierarchical manner. One such tree is the AVL Tree.

The AVL Tree is a self-balancing binary search tree, first introduced by Adelson-Velsky and Landis in 1962. It maintains a height-balance property, which ensures that the heights of a node’s left and right subtrees differ by at most one.

This enables efficient search operations with a time complexity of O(log n), where n is the number of nodes in the tree. In this article, we will delve deeper into AVL Trees in Python, discussing their definition, advantages, balance factor, and operations.

Understanding AVL Trees’ properties and operations can aid you in developing efficient algorithms and data structures for your projects.

## What Is an AVL Tree?

An AVL Tree is a type of self-balancing binary search tree that maintains a height-balance property. It is called balanced because, unlike regular binary search trees that may become skewed, the heights of an AVL Tree’s subtrees stay nearly equal on both sides.

The heights of a node’s left and right subtrees may differ by no more than one, and the balance factor of a node is defined as the difference between the heights of its left and right subtrees. The benefits of using AVL Trees are numerous.

Firstly, the O(log n) height guarantee ensures efficient searching even with larger datasets. Secondly, because the tree stays balanced, all nodes are kept relatively close to the root, which further speeds up lookups.

Finally, because of the self-balancing property, search and insert operations in AVL Trees are more predictable than in traditional binary search trees, which can degrade to O(n) when they become skewed.

## Balance Factor in AVL Tree

The balance factor is a critical aspect of AVL Trees, as it determines whether a subtree is balanced or not. A node’s balance factor is the height of its left subtree minus the height of its right subtree.

If the absolute value of the balance factor exceeds one, the node is unbalanced, and rebalancing operations must be carried out. Rebalancing is done by rotating subtrees so that the AVL Tree remains balanced.

Single rotations come in two types: a left rotation and a right rotation. A left rotation is used when the right subtree is taller than the left subtree, and a right rotation when the left subtree is taller.

These operations aim to keep the AVL Tree height-balanced for better performance.
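To make the idea concrete, here is a minimal sketch of a left rotation; the `Node` class and `left_rotate` helper are simplified stand-ins for illustration, not the article’s full implementation (which also tracks heights):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def left_rotate(node):
    # The right child becomes the new subtree root; its old left
    # subtree is reattached as the rotated node's right child.
    new_root = node.right
    node.right = new_root.left
    new_root.left = node
    return new_root

# A right-skewed chain: 1 -> 2 -> 3
root = Node(1)
root.right = Node(2)
root.right.right = Node(3)

# One left rotation at the root balances it: 2 with children 1 and 3.
root = left_rotate(root)
print(root.value, root.left.value, root.right.value)  # 2 1 3
```

A right rotation is the mirror image of this operation, with `left` and `right` swapped.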

## Searching in AVL Tree

Searching operations in AVL Trees are similar to those of regular binary search trees. The AVL Tree is searched recursively, starting from the root node.

Because of its height-balanced property, the search time complexity is O(log n), which ensures efficient search operations even with larger datasets.
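The code section later in this article does not include a search routine, so here is a sketch of one; the `avl_Search` helper is illustrative and not part of the article’s implementation, and the node class is simplified to the fields the search needs:

```python
class avl_Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def avl_Search(node, value):
    # Standard BST descent: go left for smaller keys, right for larger.
    if node is None:
        return None
    if value == node.value:
        return node
    if value < node.value:
        return avl_Search(node.left, value)
    return avl_Search(node.right, value)

# Hand-built balanced tree: 20 with children 10 and 30.
root = avl_Node(20)
root.left = avl_Node(10)
root.right = avl_Node(30)

print(avl_Search(root, 30).value)  # 30
print(avl_Search(root, 99))        # None
```

Because the tree is height-balanced, each comparison discards roughly half of the remaining nodes, giving the O(log n) bound.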

## Inserting a Node in AVL Tree

Inserting a node in an AVL Tree involves ensuring that the AVL Tree’s height remains balanced. When a node is inserted, a search operation is carried out to find the appropriate leaf node, where the new node is inserted.

Once the new node is added, the balance factor is calculated, and balance operations are carried out if necessary to ensure the tree’s height remains balanced.

On insertion, four kinds of imbalance can arise; the two simplest are the left-left and the right-right imbalance.

A left-left imbalance occurs when a new node is inserted into the left subtree of a node’s left child, and a right-right imbalance when a new node is inserted into the right subtree of a node’s right child. In these cases, a single right or left rotation, respectively, rebalances the tree. The remaining left-right and right-left cases require double rotations, which are covered in the next section.
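Here is a small self-contained sketch of the left-left case; the helpers are simplified for illustration, and `height` is recomputed recursively here, whereas the article’s implementation stores the height on each node:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def height(node):
    # Recomputed on every call for simplicity.
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    return height(node.left) - height(node.right)

def right_rotate(node):
    new_root = node.left
    node.left = new_root.right
    new_root.right = node
    return new_root

# Left-left imbalance: 20 and then 10 inserted under 30's left side.
root = Node(30)
root.left = Node(20)
root.left.left = Node(10)

print(balance_factor(root))  # 2 (left-heavy)
root = right_rotate(root)    # single right rotation
print(root.value, balance_factor(root))  # 20 0
```

The right-right case is symmetric and is fixed with a single left rotation.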

## Deleting a Node from AVL Tree

Deleting a node in AVL Trees requires performing similar operations to insertion. Once the node to be deleted is found, it is removed, and the balance factor is calculated.

Balance operations are then performed based on the same imbalance types discussed above. Besides the left-left and right-right cases, a right-left or left-right imbalance may be detected, in which case double rotations are used.

A right-left imbalance occurs when the extra height lies in the left subtree of a node’s right child, whereas a left-right imbalance occurs when the extra height lies in the right subtree of a node’s left child.
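The right-left case can be sketched in the same simplified style as before (illustrative helpers, heights recomputed recursively rather than stored):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def height(node):
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    return height(node.left) - height(node.right)

def left_rotate(node):
    new_root = node.right
    node.right = new_root.left
    new_root.left = node
    return new_root

def right_rotate(node):
    new_root = node.left
    node.left = new_root.right
    new_root.right = node
    return new_root

# Right-left imbalance: 20 sits in the left subtree of 10's right child 30.
root = Node(10)
root.right = Node(30)
root.right.left = Node(20)

print(balance_factor(root))            # -2 (right-heavy)
root.right = right_rotate(root.right)  # first rotation: right-left -> right-right
root = left_rotate(root)               # second rotation balances the tree
print(root.value, root.left.value, root.right.value)  # 20 10 30
```

The left-right case is the mirror image: a left rotation on the node’s left child followed by a right rotation on the node itself.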

## Conclusion

In conclusion, AVL Trees are a special type of balanced binary search tree that optimizes search and insertion operations. Their height-balanced property ensures fast performance even with larger datasets.

In Python, implementing an AVL Tree can aid you in developing efficient algorithms and data structures for your projects. Understanding AVL Trees’ properties and operations is a valuable asset for any developer.

## Python Functions for AVL Tree

Now that we have a deeper understanding of AVL Trees and their operations, we can walk through a Python implementation with the functions necessary to create an AVL Tree. In this section, we will cover initializing nodes, calculation functions, rotation functions, and operation functions.

## Initializing Nodes

To start with, we must define the nodes of an AVL Tree, each of which holds the node’s value, its left and right children, and its height. We can implement this by creating a class for the AVL node.

```python
class avl_Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None
        self.height = 1
```

This creates a standard AVL Node with value, left, right, and height.
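A quick sanity check of the class (repeated here so the snippet runs on its own): a freshly created node is a leaf of height 1 with no children.

```python
class avl_Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None
        self.height = 1

# A new node starts as a leaf: height 1, no children.
node = avl_Node(10)
print(node.value, node.height, node.left, node.right)  # 10 1 None None
```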

## Calculation Functions

Next, we need functions to calculate the height of a subtree, compute its balance factor, and find the minimum value in a subtree.

```python
def avl_Height(node):
    # An empty subtree has height 0.
    if not node:
        return 0
    return node.height

def avl_BalanceFactor(node):
    if not node:
        return 0
    return avl_Height(node.left) - avl_Height(node.right)

def avl_MinValue(node):
    # The smallest value is the leftmost node.
    current_node = node
    while current_node.left:
        current_node = current_node.left
    return current_node
```

These functions are vital for calculating the height of a node, its balance factor, and finding the minimum value present in the binary tree.
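A short standalone demonstration of these functions (the class and function definitions are repeated so the snippet runs on its own; the heights are set by hand since the insert function is defined later):

```python
class avl_Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None
        self.height = 1

def avl_Height(node):
    if not node:
        return 0
    return node.height

def avl_BalanceFactor(node):
    if not node:
        return 0
    return avl_Height(node.left) - avl_Height(node.right)

def avl_MinValue(node):
    current_node = node
    while current_node.left:
        current_node = current_node.left
    return current_node

# Hand-built tree: 20 with a single left child 10.
root = avl_Node(20)
root.left = avl_Node(10)
root.height = 2  # set manually for this example

print(avl_Height(root))          # 2
print(avl_BalanceFactor(root))   # 1 (left subtree is one level taller)
print(avl_MinValue(root).value)  # 10
```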

## Functions for Rotations

Next, we require left and right rotation functions to rotate a node about its parent node. These functions are critical in maintaining the height-balanced property of the AVL Tree.

```python
def leftRotate(node):
    # The right child becomes the new subtree root; its old left
    # subtree becomes the rotated node's right subtree.
    new_parent = node.right
    subTree = new_parent.left
    new_parent.left = node
    node.right = subTree
    # Update heights bottom-up: the rotated node first, then the new parent.
    node.height = max(avl_Height(node.left), avl_Height(node.right)) + 1
    new_parent.height = max(avl_Height(new_parent.left), avl_Height(new_parent.right)) + 1
    return new_parent

def rightRotate(node):
    # Mirror image of leftRotate.
    new_parent = node.left
    subTree = new_parent.right
    new_parent.right = node
    node.left = subTree
    node.height = max(avl_Height(node.left), avl_Height(node.right)) + 1
    new_parent.height = max(avl_Height(new_parent.left), avl_Height(new_parent.right)) + 1
    return new_parent
```

The leftRotate and rightRotate functions enable us to modify the AVL Tree’s node structure while keeping its height balanced property intact.
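To see a rotation in action with stored heights, here is a standalone example using the same `leftRotate` logic as above (the class and helper are repeated so the snippet runs on its own; the initial heights are set by hand):

```python
class avl_Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None
        self.height = 1

def avl_Height(node):
    if not node:
        return 0
    return node.height

def leftRotate(node):
    new_parent = node.right
    subTree = new_parent.left
    new_parent.left = node
    node.right = subTree
    node.height = max(avl_Height(node.left), avl_Height(node.right)) + 1
    new_parent.height = max(avl_Height(new_parent.left), avl_Height(new_parent.right)) + 1
    return new_parent

# Right-skewed chain 10 -> 20 -> 30 with heights set by hand.
root = avl_Node(10)
root.right = avl_Node(20)
root.right.right = avl_Node(30)
root.right.height = 2
root.height = 3

root = leftRotate(root)
print(root.value, root.left.value, root.right.value)  # 20 10 30
print(root.height)  # 2 (heights were updated by the rotation)
```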

## Functions for Operations

Finally, we need the insert, delete, and pre-order traversal functions to perform operations on an AVL Tree.

```python
def insert_node(node, value):
    if not node:
        return avl_Node(value)
    elif value > node.value:
        node.right = insert_node(node.right, value)
    else:
        node.left = insert_node(node.left, value)

    node.height = 1 + max(avl_Height(node.left), avl_Height(node.right))
    balance = avl_BalanceFactor(node)

    # Left Left Case
    if balance > 1 and value < node.left.value:
        return rightRotate(node)
    # Right Right Case
    if balance < -1 and value > node.right.value:
        return leftRotate(node)
    # Left Right Case
    if balance > 1 and value > node.left.value:
        node.left = leftRotate(node.left)
        return rightRotate(node)
    # Right Left Case
    if balance < -1 and value < node.right.value:
        node.right = rightRotate(node.right)
        return leftRotate(node)
    return node

def delete_node(root, value):
    if not root:
        return root
    elif value < root.value:
        root.left = delete_node(root.left, value)
    elif value > root.value:
        root.right = delete_node(root.right, value)
    else:
        if root.right is None:
            new_root = root.left
            root = None
            return new_root
        elif root.left is None:
            new_root = root.right
            root = None
            return new_root
        # Two children: copy the in-order successor's value, then delete it.
        new_root = avl_MinValue(root.right)
        root.value = new_root.value
        root.right = delete_node(root.right, new_root.value)

    if root is None:
        return root

    root.height = 1 + max(avl_Height(root.left), avl_Height(root.right))
    balance = avl_BalanceFactor(root)

    # Left Left Case
    if balance > 1 and avl_BalanceFactor(root.left) >= 0:
        return rightRotate(root)
    # Right Right Case
    if balance < -1 and avl_BalanceFactor(root.right) <= 0:
        return leftRotate(root)
    # Left Right Case
    if balance > 1 and avl_BalanceFactor(root.left) < 0:
        root.left = leftRotate(root.left)
        return rightRotate(root)
    # Right Left Case
    if balance < -1 and avl_BalanceFactor(root.right) > 0:
        root.right = rightRotate(root.right)
        return leftRotate(root)
    return root

def preOrder(root):
    if not root:
        return
    print(f"{root.value} ", end="")
    preOrder(root.left)
    preOrder(root.right)
```

The insert_node function adds a node, delete_node removes one, and preOrder prints the AVL Tree’s node values in pre-order.

Putting everything together, the complete program looks like this:

```python
class avl_Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None
        self.height = 1

def avl_Height(node):
    if not node:
        return 0
    return node.height

def avl_BalanceFactor(node):
    if not node:
        return 0
    return avl_Height(node.left) - avl_Height(node.right)

def avl_MinValue(node):
    current_node = node
    while current_node.left:
        current_node = current_node.left
    return current_node

def leftRotate(node):
    # Rotate and update the nodes
    new_parent = node.right
    subTree = new_parent.left
    new_parent.left = node
    node.right = subTree
    node.height = max(avl_Height(node.left), avl_Height(node.right)) + 1
    new_parent.height = max(avl_Height(new_parent.left), avl_Height(new_parent.right)) + 1
    return new_parent

def rightRotate(node):
    # Rotate and update the nodes
    new_parent = node.left
    subTree = new_parent.right
    new_parent.right = node
    node.left = subTree
    node.height = max(avl_Height(node.left), avl_Height(node.right)) + 1
    new_parent.height = max(avl_Height(new_parent.left), avl_Height(new_parent.right)) + 1
    return new_parent

def insert_node(node, value):
    if not node:
        return avl_Node(value)
    elif value > node.value:
        node.right = insert_node(node.right, value)
    else:
        node.left = insert_node(node.left, value)

    node.height = 1 + max(avl_Height(node.left), avl_Height(node.right))
    balance = avl_BalanceFactor(node)

    # Left Left Case
    if balance > 1 and value < node.left.value:
        return rightRotate(node)
    # Right Right Case
    if balance < -1 and value > node.right.value:
        return leftRotate(node)
    # Left Right Case
    if balance > 1 and value > node.left.value:
        node.left = leftRotate(node.left)
        return rightRotate(node)
    # Right Left Case
    if balance < -1 and value < node.right.value:
        node.right = rightRotate(node.right)
        return leftRotate(node)
    return node

def delete_node(root, value):
    if not root:
        return root
    elif value < root.value:
        root.left = delete_node(root.left, value)
    elif value > root.value:
        root.right = delete_node(root.right, value)
    else:
        if root.right is None:
            new_root = root.left
            root = None
            return new_root
        elif root.left is None:
            new_root = root.right
            root = None
            return new_root
        # Two children: copy the in-order successor's value, then delete it.
        new_root = avl_MinValue(root.right)
        root.value = new_root.value
        root.right = delete_node(root.right, new_root.value)

    if root is None:
        return root

    root.height = 1 + max(avl_Height(root.left), avl_Height(root.right))
    balance = avl_BalanceFactor(root)

    # Left Left Case
    if balance > 1 and avl_BalanceFactor(root.left) >= 0:
        return rightRotate(root)
    # Right Right Case
    if balance < -1 and avl_BalanceFactor(root.right) <= 0:
        return leftRotate(root)
    # Left Right Case
    if balance > 1 and avl_BalanceFactor(root.left) < 0:
        root.left = leftRotate(root.left)
        return rightRotate(root)
    # Right Left Case
    if balance < -1 and avl_BalanceFactor(root.right) > 0:
        root.right = rightRotate(root.right)
        return leftRotate(root)
    return root

def preOrder(root):
    if not root:
        return
    print(f"{root.value} ", end="")
    preOrder(root.left)
    preOrder(root.right)

# Test code
if __name__ == '__main__':
    Tree = None
    Tree = insert_node(Tree, 10)
    Tree = insert_node(Tree, 20)
    Tree = insert_node(Tree, 30)
    Tree = insert_node(Tree, 40)
    Tree = insert_node(Tree, 50)
    Tree = insert_node(Tree, 25)
    print("PREORDER")
    preOrder(Tree)
    print()

    # Deleting nodes
    Tree = delete_node(Tree, 30)
    Tree = delete_node(Tree, 40)
    print("PREORDER")
    preOrder(Tree)
    print()
```

Running the program prints:

```
PREORDER
30 20 10 25 40 50
PREORDER
20 10 50 25
```

The first traversal shows the tree after the six insertions (with root 30), and the second shows it after deleting 30 and 40. Throughout these operations, the rotations keep the AVL Tree height-balanced.

## Conclusion

In this article, we have delved deeper into AVL Trees and the Python functions that implement their insert, delete, and pre-order traversal operations. We have looked at the structure of an AVL node, its balance factor, and the leftRotate and rightRotate rotations used to maintain the height-balanced property of an AVL Tree.

We have also walked through a complete Python implementation of AVL Trees to showcase these operations and their output. As a Python developer, understanding and implementing AVL Trees is a valuable asset, especially when working on significant projects with large datasets that need efficient search and insertion operations.

## Summary

In this article, we have covered the basics of AVL Trees, including their definition, advantages, balance factor, and operations. We have also examined Python functions for initializing nodes, performing calculations, and carrying out the different types of rotations and operations.