Optimizing File System Access Time through Intelligent Caching

Authors

  • Rohini Kumari, Noida Institute of Engineering and Technology, Greater Noida
  • Prince Kumar
  • Rohan Kumar
  • Ajit Kumar

Keywords

Storage, File System, Intelligent Caching, Data Access, Performance Optimization

Abstract

File systems form the foundation of modern computing, yet data access latency remains a persistent performance bottleneck. While processors continue to advance rapidly, storage access speeds have not kept pace, leading to delays that directly affect application responsiveness. Traditional caching techniques such as Least Recently Used (LRU) and Least Frequently Used (LFU) offer limited benefits in dynamic and unpredictable workloads because they rely on fixed heuristics. This work presents an intelligent caching approach that continuously observes file access patterns and uses predictive models to anticipate future data requests. By dynamically adapting cache contents and size based on real-time workload characteristics, the proposed system prioritizes frequently and imminently accessed data, thereby reducing access latency and improving cache hit ratios. Experimental observations on synthetic workloads indicate that intelligent caching significantly outperforms conventional caching strategies in terms of average access time and adaptability, particularly under mixed and changing workloads. The study also discusses implementation considerations, overhead challenges, and real-world applicability, highlighting intelligent caching as a promising solution for next-generation file system performance optimization.
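To make the comparison in the abstract concrete, the sketch below contrasts a classic LRU cache with a simple adaptive policy that blends recency and access frequency when choosing an eviction victim. This is an illustrative toy model only: the `AdaptiveCache` class, its `recency_weight` parameter, and its scoring function are assumptions made for demonstration, not the predictive model proposed in the paper.

```python
from collections import OrderedDict

class LRUCache:
    """Baseline policy: evict the least recently used entry."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = 0
        self.misses = 0

    def access(self, key):
        if key in self.store:
            self.store.move_to_end(key)  # mark as most recently used
            self.hits += 1
        else:
            self.misses += 1
            if len(self.store) >= self.capacity:
                self.store.popitem(last=False)  # evict LRU entry
            self.store[key] = True

class AdaptiveCache:
    """Illustrative adaptive policy (hypothetical): scores each entry by a
    weighted mix of recency and access frequency, evicting the entry with
    the lowest score instead of relying on a single fixed heuristic."""
    def __init__(self, capacity, recency_weight=0.5):
        self.capacity = capacity
        self.w = recency_weight   # 1.0 -> pure recency, 0.0 -> pure frequency
        self.freq = {}            # key -> access count
        self.last_seen = {}       # key -> logical timestamp of last access
        self.clock = 0
        self.hits = 0
        self.misses = 0

    def _score(self, key):
        # Higher score = more valuable; the recency term decays with age.
        age = self.clock - self.last_seen[key]
        return self.w * (1.0 / (1 + age)) + (1 - self.w) * self.freq[key]

    def access(self, key):
        self.clock += 1
        if key in self.freq:
            self.hits += 1
        else:
            self.misses += 1
            if len(self.freq) >= self.capacity:
                victim = min(self.freq, key=self._score)
                del self.freq[victim]
                del self.last_seen[victim]
            self.freq[key] = 0
        self.freq[key] += 1
        self.last_seen[key] = self.clock
```

Replaying the same access trace through both caches and comparing hit counts is one simple way to reproduce, on synthetic workloads, the kind of policy comparison the abstract describes; tuning `recency_weight` shows how a blended score can track workloads that defeat pure LRU or pure LFU.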

Published

2025-12-23

Section

Articles