Tensor Indexing in TensorFlow

In the realm of machine learning and deep learning, tensors are fundamental data structures used to represent numerical data with multiple dimensions. TensorFlow, a powerful numerical computation library, equips you with an intuitive and versatile set of operations for manipulating and accessing data within these tensors. Understanding tensor indexing in TensorFlow is crucial for navigating data effectively and building machine learning models.

Tensor in TensorFlow

In TensorFlow, a tensor is the fundamental data structure used to represent multi-dimensional arrays. Tensors are the primary building blocks of TensorFlow computations, serving as the basic units for storing and manipulating data. Tensors can be:

- Scalars (0-dimensional tensors)
- Vectors (1-dimensional tensors)
- Matrices (2-dimensional tensors)
- Higher-dimensional arrays

Characteristics of tensors in TensorFlow

- Shape: Tensors have a shape that determines their dimensions and size along each axis. For example, a 2x3 matrix has the shape (2, 3), which represents two rows and three columns.
- Data type: Tensors have a data type, which describes the type of data stored in their elements. Common data types are float32, int32, and string.
- Rank: The rank of a tensor is the number of dimensions it holds. A scalar tensor (a single value) has rank 0, a vector tensor (a 1-dimensional array) has rank 1, and a matrix tensor (a 2-dimensional array) has rank 2.
- Axis: An axis is one particular dimension of a tensor. Tensors can have multiple axes, each representing a different dimension of the data. For example, a 2D tensor has two axes, rows (axis 0) and columns (axis 1), while a 3D tensor has three axes (e.g., 0 for height, 1 for width, and 2 for depth).
- Values: Tensors contain values or elements stored in a contiguous memory block. These values can be accessed and manipulated using various TensorFlow operations.

Tensor Indexing in TensorFlow

Tensor indexing is the process of accessing and manipulating specific elements or subsets of a tensor. Much like indexing lists or strings in Python, or arrays in NumPy, it lets you extract particular pieces or slices of data from a tensor, and it is required for many TensorFlow tasks, including data preparation, feature extraction, and model evaluation. TensorFlow offers specialized operations such as tf.slice, tf.gather, tf.gather_nd, and tf.scatter_nd for these indexing tasks. In the sections below, we explore how to use these operations for slicing, extracting, and inserting data within your tensors. This article explores the following ways to index tensors:

1. Slicing

Slicing in TensorFlow lets you grab specific sections of your data, just like picking out a slice of cake! It's useful for extracting smaller vectors, matrices, or even higher-dimensional chunks from your tensors.
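Before reaching for the dedicated ops, note that TensorFlow tensors also accept NumPy-style bracket indexing and slicing directly. The short sketch below illustrates the idea (the name t0 is purely illustrative, not from the article):

Python3

import tensorflow as tf

# A small 1-D tensor to slice (illustrative example)
t0 = tf.constant([0, 1, 2, 3, 4, 5, 6, 7])

print(t0[3])     # single element at index 3
print(t0[1:4])   # elements at indices 1, 2 and 3
print(t0[::2])   # every second element

Output:

tf.Tensor(3, shape=(), dtype=int32)
tf.Tensor([1 2 3], shape=(3,), dtype=int32)
tf.Tensor([0 2 4 6], shape=(4,), dtype=int32)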
Slice It Up with tf.slice

This operation takes three ingredients:

- The main ingredient: the tensor you want to slice.
- Starting point (begin): where you want to begin the slice (like choosing your first bite).
- Size: how many elements to take along each axis (a small, dainty slice or a generous piece?).

Imagine a delicious tensor t1 full of numbers:

Python3

import tensorflow as tf

t1 = tf.constant([0, 1, 2, 3, 4, 5, 6, 7])
t1

Output:

<tf.Tensor: shape=(8,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7], dtype=int32)>

Want to grab three elements starting at index 1 (that is, indices 1 through 3)? Use tf.slice:

Python3

s1 = tf.slice(t1, begin=[1], size=[3])
print(s1)

Output:

tf.Tensor([1 2 3], shape=(3,), dtype=int32)
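The same begin/size recipe extends to higher-rank tensors: pass one offset and one size per axis. Below is a minimal sketch on a hypothetical 3x3 matrix (t2, s2 and s3 are illustrative names); a size entry of -1 means "take everything remaining along that axis".

Python3

import tensorflow as tf

t2 = tf.constant([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 9]])

# Start at row 0, column 1 and take a 2x2 block
s2 = tf.slice(t2, begin=[0, 1], size=[2, 2])
print(s2)

# A size of -1 keeps all remaining elements along that axis:
# rows 1 to the end, and every column from column 0 onward
s3 = tf.slice(t2, begin=[1, 0], size=[-1, -1])
print(s3)

Output:

tf.Tensor(
[[2 3]
 [5 6]], shape=(2, 2), dtype=int32)
tf.Tensor(
[[4 5 6]
 [7 8 9]], shape=(2, 3), dtype=int32)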