ETRI Knowledge Sharing Platform

Design of Cache Backend using Remote Memory for Network File System
Cited 3 times in Scopus
Authors
Eun-Ji Lim, Shin-Young Ahn, Young-Ho Kim, Gyu-Il Cha, Wan Choi
Issue Date
2017-07
Citation
International Conference on High Performance Computing and Simulation (HPCS) 2017, pp.864-869
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/HPCS.2017.131
Project Code
17HS1900, Development of HPC System for Accelerating Large-scale Deep Learning, Choi Wan
Abstract
In high-performance data processing, the performance gap between computation devices and storage prevents full utilization of computation resources and causes a system bottleneck. In addition, big-data applications that require interactive, real-time, and complex computation need faster data I/O than distributed file systems can provide. We therefore propose CacheDM, a new cache backend for the network file system (NFS) that uses distributed memory as a cache medium in cluster environments where computing nodes are connected by a high-speed network. CacheDM provides a low-latency, high-speed cache by accessing cached data through direct memory copies using RDMA. Because CacheDM is designed as a cache backend for FS-Cache, users gain its performance advantage without modifying existing applications or NFS.
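
The abstract describes CacheDM's read path: on a cache hit, data is fetched from remote memory with an RDMA-style direct memory copy; on a miss, the data is read over NFS and then cached in remote memory for later accesses. The sketch below is a minimal user-space illustration of that hit/miss logic under stated assumptions, not the authors' kernel code; the cache index layout and the rdma_read_page() and nfs_read_page() helpers are hypothetical stand-ins (stubbed here) for the RDMA and NFS layers that the real FS-Cache backend would use.

/*
 * Minimal sketch of a CacheDM-style read path (illustrative only).
 * The real CacheDM is a kernel FS-Cache backend; this user-space
 * model only shows the hit/miss decision described in the abstract.
 * rdma_read_page() and nfs_read_page() are hypothetical stubs.
 */
#include <stdio.h>
#include <string.h>
#include <stdint.h>

#define PAGE_SIZE   4096
#define CACHE_SLOTS 1024

/* One entry of a (simplified) index over remotely cached pages. */
struct cache_entry {
    uint64_t file_id;      /* which file the page belongs to        */
    uint64_t page_index;   /* page offset within the file           */
    uint64_t remote_addr;  /* address in the remote memory pool     */
    int      valid;
};

static struct cache_entry cache_index[CACHE_SLOTS];

/* Hypothetical stub: one-sided RDMA read from remote memory. */
static int rdma_read_page(uint64_t remote_addr, void *buf)
{
    (void)remote_addr;
    memset(buf, 'R', PAGE_SIZE);   /* pretend data arrived via RDMA */
    return 0;
}

/* Hypothetical stub: fall back to a regular NFS read. */
static int nfs_read_page(uint64_t file_id, uint64_t page_index, void *buf)
{
    (void)file_id; (void)page_index;
    memset(buf, 'N', PAGE_SIZE);   /* pretend data came from the NFS server */
    return 0;
}

/* Look up a page in the remote-memory cache index. */
static struct cache_entry *cache_lookup(uint64_t file_id, uint64_t page_index)
{
    struct cache_entry *e = &cache_index[(file_id ^ page_index) % CACHE_SLOTS];
    if (e->valid && e->file_id == file_id && e->page_index == page_index)
        return e;
    return NULL;
}

/*
 * Read one page: serve it with an RDMA read on a cache hit,
 * otherwise read it over NFS and record it in the index so the
 * next access can be served from remote memory.
 */
static int cachedm_read_page(uint64_t file_id, uint64_t page_index, void *buf)
{
    struct cache_entry *e = cache_lookup(file_id, page_index);

    if (e)                                   /* hit: direct memory copy via RDMA */
        return rdma_read_page(e->remote_addr, buf);

    if (nfs_read_page(file_id, page_index, buf) != 0)   /* miss: go to the server */
        return -1;

    /* Populate the index; remote_addr would come from the remote memory pool. */
    cache_index[(file_id ^ page_index) % CACHE_SLOTS] = (struct cache_entry){
        .file_id = file_id, .page_index = page_index,
        .remote_addr = page_index * PAGE_SIZE, .valid = 1,
    };
    return 0;
}

int main(void)
{
    char buf[PAGE_SIZE];
    cachedm_read_page(1, 0, buf);  /* first access: miss, served over NFS */
    cachedm_read_page(1, 0, buf);  /* second access: hit, served via RDMA */
    printf("second read served from: %c (R = remote memory)\n", buf[0]);
    return 0;
}

In the actual design, this logic sits below FS-Cache rather than in user space, which is why existing applications and NFS need no modification.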
KSP Keywords
Big Data, Computation resource, Data processing, Distributed file system, High performance, High-speed network, Low latency, Memory copy, Network file system (NFS), Performance data, Real-time