Explanation:
In operating systems, to reduce response time and increase throughput, the kernel minimizes the frequency of disk access by using a buffer cache. A buffer cache is an area of main memory that temporarily holds data being transferred between disk storage and the operating system. By keeping frequently accessed disk blocks in memory, it avoids repeated disk reads and writes, thus improving performance.
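As a rough illustration (not how any real kernel implements it), the sketch below models a buffer cache as an LRU-evicted dictionary keyed by disk block number; `read_from_disk`, `BLOCK_SIZE`, and `CACHE_CAPACITY` are hypothetical placeholders.

```python
from collections import OrderedDict

BLOCK_SIZE = 4096      # hypothetical block size
CACHE_CAPACITY = 64    # number of blocks kept in memory

def read_from_disk(block_no: int) -> bytes:
    """Placeholder for a slow device read."""
    return b"\x00" * BLOCK_SIZE

class BufferCache:
    def __init__(self, capacity: int = CACHE_CAPACITY):
        self.capacity = capacity
        self.blocks: OrderedDict[int, bytes] = OrderedDict()

    def read_block(self, block_no: int) -> bytes:
        if block_no in self.blocks:
            # Cache hit: serve from memory, mark as most recently used.
            self.blocks.move_to_end(block_no)
            return self.blocks[block_no]
        # Cache miss: go to disk, then keep a copy in memory.
        data = read_from_disk(block_no)
        self.blocks[block_no] = data
        if len(self.blocks) > self.capacity:
            # Evict the least recently used block to make room.
            self.blocks.popitem(last=False)
        return data

cache = BufferCache()
cache.read_block(7)   # first access: goes to "disk"
cache.read_block(7)   # second access: served from the cache, no disk I/O
```

The second call to `read_block(7)` never touches the disk, which is the behavior that makes the buffer cache reduce disk access frequency.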
Other Options:
- (A) Pooling – This generally refers to managing reusable resources such as memory, threads, or connections in a shared pool; it is not specifically about minimizing disk access or improving response time in this context.
- (B) Spooling – Spooling temporarily stores data in a buffer (usually on disk) so it can be processed later, for example when queueing print jobs; it is not directly related to minimizing disk access.
- (D) Swapping – Swapping moves processes or pages between RAM and disk when memory is scarce; it actually increases disk traffic rather than minimizing it.
Thus, (C) Buffer cache is the correct answer.