Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/181426
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zhou, Yihan | en_US |
dc.contributor.author | Chen, Yan | en_US |
dc.contributor.author | Rao, Xuanming | en_US |
dc.contributor.author | Zhou, Yukang | en_US |
dc.contributor.author | Li, Yuxin | en_US |
dc.contributor.author | Hu, Chao | en_US |
dc.date.accessioned | 2024-12-02T04:47:20Z | - |
dc.date.available | 2024-12-02T04:47:20Z | - |
dc.date.issued | 2024 | - |
dc.identifier.citation | Zhou, Y., Chen, Y., Rao, X., Zhou, Y., Li, Y. & Hu, C. (2024). Leveraging large language models and BERT for log parsing and anomaly detection. Mathematics, 12(17), 2758. https://dx.doi.org/10.3390/math12172758 | en_US |
dc.identifier.issn | 2227-7390 | en_US |
dc.identifier.uri | https://hdl.handle.net/10356/181426 | - |
dc.description.abstract | Computer systems and applications generate large volumes of logs to measure and record information, which is vital for protecting systems from malicious attacks and for repairing faults, especially with the rapid development of distributed computing. Among these logs, anomaly logs help operations and maintenance (O&M) personnel locate faults and improve efficiency. In this paper, we use a large language model, ChatGPT, for the log parsing task and choose the BERT model, a self-supervised framework, for log anomaly detection. BERT, which embeds a Transformer encoder with a self-attention mechanism, is well suited to context-dependent tasks such as anomaly log detection; its masked language model and next sentence prediction pretraining tasks capture the patterns of normal log sequences. Experimental results on two log datasets show that the BERT model combined with an LLM outperforms classical models such as DeepLog and LogAnomaly. (An illustrative sketch of the masked-prediction idea follows this metadata table.) | en_US |
dc.language.iso | en | en_US |
dc.relation.ispartof | Mathematics | en_US |
dc.rights | © 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | en_US |
dc.subject | Computer and Information Science | en_US |
dc.title | Leveraging large language models and BERT for log parsing and anomaly detection | en_US |
dc.type | Journal Article | en |
dc.contributor.school | School of Computer Science and Engineering | en_US |
dc.identifier.doi | 10.3390/math12172758 | - |
dc.description.version | Published version | en_US |
dc.identifier.scopus | 2-s2.0-85203646702 | - |
dc.identifier.issue | 17 | en_US |
dc.identifier.volume | 12 | en_US |
dc.identifier.spage | 2758 | en_US |
dc.subject.keywords | Anomaly log detection | en_US |
dc.subject.keywords | Large language models | en_US |
dc.description.acknowledgement | This research was sponsored in part by the National Natural Science Foundation of China (No. 62177046 and 62477046), Hunan 14th Five-Year Plan Educational Science Research Project (No. XJK23AJD022 and XJK23AJD021), Hunan Social Science Foundation (No. 22YBA012), Hunan Provincial Key Research and Development Project (No. 2021SK2022), and High Performance Computing Center of Central South University. | en_US |
item.fulltext | With Fulltext | - |
item.grantfulltext | open | - |
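The abstract above describes masked-language-model (MLM) scoring over parsed log sequences. The sketch below is purely illustrative and is not the authors' code: it reuses an off-the-shelf `bert-base-uncased` checkpoint from the Hugging Face `transformers` library (the paper trains BERT on log-key sequences and parses raw logs with ChatGPT, neither of which is reproduced here), and the function name `score_sequence`, the probability threshold, and the toy templates are hypothetical.

```python
# Illustrative sketch only: score each token of a parsed log-template sequence
# with a masked language model and flag low-probability positions as anomalous.
# The paper's pipeline (ChatGPT-based parsing, BERT trained on log keys) is not
# reproduced; an off-the-shelf English BERT is used purely to show the mechanics.

import torch
from transformers import BertForMaskedLM, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()


def score_sequence(log_templates, threshold=1e-4):
    """Mask each token in turn and check how probable the model considers
    the original token; a very low probability suggests an anomaly."""
    text = " ; ".join(log_templates)
    enc = tokenizer(text, return_tensors="pt")
    input_ids = enc["input_ids"][0]
    flags = []
    for pos in range(1, input_ids.size(0) - 1):  # skip [CLS] and [SEP]
        masked = input_ids.clone()
        original = masked[pos].item()
        masked[pos] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(input_ids=masked.unsqueeze(0),
                           attention_mask=enc["attention_mask"]).logits
        prob = torch.softmax(logits[0, pos], dim=-1)[original].item()
        flags.append((tokenizer.decode([original]), prob, prob < threshold))
    return flags


if __name__ == "__main__":
    # Toy sequence of parsed log templates; in the paper this parsing step
    # would be performed by an LLM.
    sequence = [
        "Receiving block <*> src <*> dest <*>",
        "PacketResponder <*> for block <*> terminating",
        "Deleting block <*> file <*>",
    ]
    for token, prob, is_anomalous in score_sequence(sequence):
        print(f"{token:>15s}  p={prob:.2e}  anomalous={is_anomalous}")
```

Each position in the concatenated template sequence is masked in turn; tokens assigned very low probability are flagged, mirroring the intuition that a BERT pretrained on normal log sequences assigns low likelihood to anomalous log keys.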
Appears in Collections: | SCSE Journal Articles |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
mathematics-12-02758-v2.pdf |  | 2.86 MB | Adobe PDF | View/Open |
Page view(s): 177 (updated on Feb 10, 2025)
Download(s): 85 (updated on Feb 10, 2025)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.