Tpram-kelly.7z

The file refers to the research paper titled "Transformer-based performance prediction and proactive resource allocation for cloud-native microservices," published in Cluster Computing in August 2025. The file name is a shorthand for the framework (Transformer-based Prediction and Resource Adaption Method) and likely one of its primary authors or a related contributor, such as Yang Chen or Hongyan Xia (whose research is often associated with these models).

Paper Summary: TPRAM
: It uses a Transformer-based attention mechanism to build a performance prediction model for microservice nodes on a system's "critical path".
: Experimental results using the DeathStarBench benchmark showed that TPRAM can save at least 40.58% of CPU and 15.84% of memory resources while maintaining end-to-end Quality of Service (QoS).

Accessing the Paper
You can find the full text or official citation through these platforms:
: The official journal publication is available at Springer Link.
: A preprint or abstract of the work is hosted on ResearchGate.
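To make the "Transformer-based attention mechanism" idea concrete: the sketch below is a minimal, illustrative example of scaled dot-product attention applied to a sequence of per-node resource metrics, as one might do when predicting a microservice node's performance. It is not the paper's actual TPRAM architecture (which is not detailed here); all array shapes, metric choices, and the linear prediction head are assumptions for illustration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                     # (T, T) pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over the key axis
    return weights @ V                                  # context-weighted values

rng = np.random.default_rng(0)

# Hypothetical input: T time steps of a node's metrics (e.g. CPU, memory, RPS, latency).
T, d_model = 8, 4
metrics = rng.normal(size=(T, d_model))

# Toy projection matrices; in a trained model these would be learned parameters.
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
context = scaled_dot_product_attention(metrics @ Wq, metrics @ Wk, metrics @ Wv)

# A simple linear head on the last step's context vector could serve as the
# performance (e.g. latency) prediction for the next interval.
w_out = rng.normal(size=d_model)
predicted_latency = float(context[-1] @ w_out)
print(context.shape)
```

In a TPRAM-style pipeline, such a predictor would feed a resource-adaptation step: predicted performance on critical-path nodes informs how much CPU and memory can be reclaimed while QoS targets still hold.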