The Hadoop Distributed File System (HDFS) is a scalable, fault-tolerant file system designed to run on commodity hardware. As a core component of the Apache Hadoop framework, it is engineered to store very large datasets reliably by distributing data across many machines, making it well suited to big data applications.