[1] 谢玉鹏. 江西省公安机关警情数据采集与综合应用研究[D]. 南昌: 江西财经大学, 2017.
[2] 汤小军, 宋华. 提升基层派出所处置复杂警情能力研究[J]. 河北公安警察职业学院学报, 2021, 21(1): 5-8.
[3] 朱国富, 孙丽丽. 警情分类处置提效能创满意[N]. 人民公安报, 2022-09-20(006).
[4] 公安部: 2023年共破获电信网络诈骗案件43.7万起[J]. 中国防伪报道, 2024(3): 7-8.
[5] 黄文婷. 公共秩序的界定: 基于香港判例的分析[J]. 港澳研究, 2022(1): 35-50.
[6] 刘鹏. 多发性侵财犯罪的打击与预防对策[J]. 四川警察学院学报, 2015, 27(5): 26-32.
[7] 张在翔. 数据挖掘技术在警情分析及预测上的应用[D]. 上海: 复旦大学, 2008.
[8] 殷小科, 王威, 王婕, 等. 分层文本分类在警情数据中的应用[J]. 现代计算机, 2021, 27(23): 86-90.
[9] 章磊, 王攀, 何芬. 自然语言处理在警情智能分析中的应用[J]. 警察技术, 2021(5): 39-43.
[10] 程春惠, 何钦铭. 面向不均衡类别朴素贝叶斯犯罪案件文本分类[J]. 计算机工程与应用, 2009, 45(35): 126-128.
[11] 张静, 高子信, 丁伟杰. 基于BERT-DPCNN的警情文本分类研究[J/OL]. 数据分析与知识发现: 1-15[2025-01-02]. http://kns.cnki.net/kcms/detail/10.1478.G2.20240313.1318.008.html.
[12] 王孟轩, 张胜, 王月, 等. 改进的CRNN模型在警情文本分类中的研究与应用[J]. 应用科学学报, 2020, 38(3): 388-400.
[13] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[J]. arXiv preprint arXiv:1810.04805, 2018.
[14] CUI Y, CHE W, LIU T, et al. Revisiting pre-trained models for Chinese natural language processing[J]. arXiv preprint arXiv:2004.13922, 2020.
[15] SUN C, QIU X, XU Y, et al. How to fine-tune BERT for text classification?[C]// Chinese Computational Linguistics: 18th China National Conference, CCL 2019. Springer International Publishing, 2019: 194-206.
[16] HOWARD J, RUDER S. Universal language model fine-tuning for text classification[J]. arXiv preprint arXiv:1801.06146, 2018.
[17] KIM Y. Convolutional neural networks for sentence classification[C]// MOSCHITTI A, PANG B, DAELEMANS W. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). Doha, Qatar: Association for Computational Linguistics, 2014: 1746-1751.
[18] HARTMANN J, HUPPERTZ J, SCHAMP C, et al. Comparing automated text classification methods[J]. International Journal of Research in Marketing, 2019, 36(1): 20-38.
[19] ZHAO W X, ZHOU K, LI J, et al. A survey of large language models[J]. arXiv preprint arXiv:2303.18223, 2023.
[20] THOPPILAN R, DE FREITAS D, HALL J, et al. LaMDA: language models for dialog applications[J]. arXiv preprint arXiv:2201.08239, 2022.
[21] ZHANG C, YANG Z, HE X, et al. Multimodal intelligence: representation learning, information fusion, and applications[J]. IEEE Journal of Selected Topics in Signal Processing, 2020, 14(3): 478-493.
[22] GAO J, LI P, CHEN Z, et al. A survey on deep learning for multimodal data fusion[J]. Neural Computation, 2020, 32(5): 829-864.
[23] LIU Z, WANG Y, VAIDYA S, et al. KAN: Kolmogorov-Arnold networks[J]. arXiv preprint arXiv:2404.19756, 2024.
[24] CUI Y, CHE W, WANG S, et al. LERT: a linguistically-motivated pre-trained language model[J]. arXiv preprint arXiv:2211.05344, 2022.
[25] LAI S, XU L, LIU K, et al. Recurrent convolutional neural networks for text classification[C]// Proceedings of the AAAI Conference on Artificial Intelligence, 2015, 29(1): 2267-2273.
[26] YAMASHITA R, NISHIO M, DO R K G, et al. Convolutional neural networks: an overview and application in radiology[J]. Insights into Imaging, 2018, 9(4): 611-629.
[27] MUSTAPHA R, ZGALLAI W A, OZSAHIN D U, et al. Convolution neural network and deep learning[M]// Artificial Intelligence and Image Processing in Medical Imaging, 2024: 21-50.
[28] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780.
[29] STAUDEMEYER R C, MORRIS E R. Understanding LSTM: a tutorial into long short-term memory recurrent neural networks[J]. arXiv preprint arXiv:1909.09586, 2019.
[30] MAHTO D, YADAV S C. Hierarchical Bi-LSTM based emotion analysis of textual data[J]. Bulletin of the Polish Academy of Sciences: Technical Sciences, 2022: e141001.
[31] TA H T. BSRBF-KAN: a combination of B-splines and radial basis functions in Kolmogorov-Arnold networks[J]. arXiv preprint arXiv:2406.11173, 2024.