Computer systems and information technologies https://csitjournal.khmnu.edu.ua/index.php/csit <div class="additional_content"> <p><strong>ISSN</strong> 2710-0766</p> <p><strong>ISSN</strong> 2710-0774 (online)</p> <p><strong>Published</strong> since 2020.</p> <p><strong>Publisher:</strong> <a title="Khmelnytskyi National University" href="https://www.khnu.km.ua" target="_blank" rel="noopener">Khmelnytskyi National University (Ukraine)</a></p> <p><strong>Frequency:</strong> 4 times a year</p> <p><strong>Manuscript languages:</strong> English</p> <p><strong>Editors:</strong> <a href="http://ki.khnu.km.ua/team/govorushhenko-tetyana/" target="_blank" rel="noopener">T. 
Hovorushchenko (Ukraine, Khmelnytskyi)</a></p> <p><strong>Certificate of state registration of print media:</strong> Series КВ № 24512-14452Р (20.07.2020).</p> <p><strong>Registration in Higher Attestation Commission of Ukraine:</strong> in processing</p> <p><strong>License terms:</strong> authors retain copyright and grant the journal the right of first publication, with the work simultaneously licensed under a <a href="http://creativecommons.org/licenses/by/4.0/" target="_blank" rel="noopener">Creative Commons Attribution 4.0 International License (CC BY)</a> that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.</p> <p><strong>Open-access statement:</strong> the journal Computer Systems and Information Technologies provides immediate <a href="https://en.wikipedia.org/wiki/Open_access" target="_blank" rel="noopener">open access</a> to its content on the principle that making research freely available to the public supports a greater global exchange of knowledge. Full-text access to the journal's scientific articles is available on the official website in the <a href="http://tribology.khnu.km.ua/index.php/ProbTrib/issue/archive" target="_blank" rel="noopener">Archives</a> section.</p> <p><strong>Address:</strong> International scientific journal “Computer Systems and Information Technologies”, Khmelnytskyi National University, Institutskaia str. 
11, Khmelnytsky, 29016, Ukraine.</p> <p><strong>Tel.:</strong> +380951122544.</p> <p><strong>E-mail:</strong> <a href="mailto:csit.khnu@gmail.com">csit.khnu@gmail.com</a>.</p> <p><strong>Website:</strong> <a href="http://csitjournal.khmnu.edu.ua" target="_blank" rel="noopener">http://csitjournal.khmnu.edu.ua</a>.</p> </div> en-US csit.khnu@gmail.com (Говорущенко Тетяна Олександрівна) csit.khnu@gmail.com (Лисенко Сергій Миколайович) Tue, 30 Dec 2025 00:00:00 +0200 OJS 3.3.0.13 http://blogs.law.harvard.edu/tech/rss 60 LASER INFORMATION SYSTEM FOR MONITORING GLUE EVAPORATION DURING DRYING https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/471 <p><em>In the context of advancing technologies in printing, materials processing, and chemical production, there is a growing demand for precision in monitoring volatile emissions generated during drying processes. Conventional control methods often lack the speed, sensitivity, or selectivity required for real-time analysis of adhesive evaporation products, particularly in dynamic industrial environments such as book block drying chambers.</em></p> <p><em>This study substantiates the development of laser-based resonance photometric systems for evaluating the concentration of evaporated chemical components from adhesive mixtures. A methodological framework is proposed for implementing dual-channel laser photometry, enabling non-contact, rapid, and spatially resolved assessment of volatile content. The structure and operating principles of a laboratory-grade concentration meter are presented, incorporating semiconductor lasers, photodetectors, signal processing units, and intelligent control agents.</em></p> <p><em>Special attention is given to the physical and quantum-chemical interactions of laser beams with aqueous media, as well as to the solubility dynamics of adhesive components under thermal influence. 
A two-stage selective photoionization approach is introduced to enhance signal amplification through resonance effects, enabling more precise quantification of emissions.</em></p> <p><em>The proposed methods can be applied not only to printing production environments but also to broader ecological monitoring tasks involving laser-based diagnostics of airborne and liquid-phase contaminants. These technologies contribute to the development of next-generation intelligent sensing systems capable of supporting environmental safety and process optimization.</em></p> Olga FEDEVYCH Copyright (c) 2025 Ольга ФЕДЕВИЧ https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/471 Tue, 30 Dec 2025 00:00:00 +0200 APPROACH TO A DECENTRALIZED PHYSICIAN-ORIENTED EHR ARCHITECTURE WITH CRYPTOGRAPHIC PROTECTION https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/481 <p><em>Modern electronic health record (EHR) systems face challenges related to security, privacy, and accessibility, especially in centralized architectures where there are risks of database compromise, data leakage, and limited interoperability between institutions and/or data storage nodes. Distributed systems in which servers operate autonomously without requiring constant connectivity demand decentralized solutions with cryptographic protection and minimal user-side requirements. A physician-centric approach enables medical institutions to optimize workflow within a trusted environment while preserving transparency for patients regarding data access.</em></p> <p><em>The proposed architecture combines local physician nodes with a shared archival registry node used for long-term data storage and patient access. Protection is achieved through envelope encryption with combined DEKs, daily rotation of server keys, and internal hash chains for detecting unauthorized modifications. 
The system supports profile migration between nodes, exchange of signed data within a local network, and offloading of completed records to the registry to optimize resource usage.</em></p> <p><em>A key principle is transparency: any data decryption is accompanied by a patient notification, and if the notification subsystem is unavailable, the operation is not performed. Access is logged with identification of the entity performing the decryption. Profile creation begins on the physician’s node: data are encrypted with a combined DEK, and hash chains ensure integrity. Two independently encrypted copies of the user key enable administrators to restore access without exposing key material. Data exchange between physicians occurs within the local network with signature verification, making the system resilient to failures.</em></p> Volodymyr KYSIL, Tetiana KYSIL Copyright (c) 2026 Володимир КИСІЛЬ, Тетяна КИСІЛЬ https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/481 Tue, 30 Dec 2025 00:00:00 +0200 METHODS OF HIDING DATA IN COMPUTER NETWORKS: FROM CLASSICS TO IoT AND AI https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/484 <p><em>The article presents an overview of key methods in network steganography, including the classification of data hiding techniques in network protocols and a discussion of promising directions for further research. Special attention is paid to the use of steganography in modern network environments such as IoT, as well as the application of artificial intelligence to traffic masking. 
The paper also outlines current approaches to hidden channel detection and threat modeling in digital communication systems.</em></p> Mykhailo SHELEST, Yurii PIDLISNYI, Mariia KAPUSTIAN Copyright (c) 2026 Mariia Kapustian https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/484 Tue, 30 Dec 2025 00:00:00 +0200 DECISION SUPPORT SYSTEM FOR PROJECT RESOURCE PLANNING BASED ON THE RANDOM FOREST METHOD https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/488 <p><em>The study develops and justifies the structure of a decision support system (DSS) designed to automate project resource planning processes using the Random Forest method. The relevance of the research is driven by the necessity to transition from subjective estimates to analytical tools for forecasting project costs and duration. The proposed system architecture covers the full data processing cycle: from automated input data collection from corporate databases (such as Jira or MS Project) to the generation of visual reports for management. Implementing the Random Forest algorithm within the DSS framework enables the identification of critical project parameters, specifically technical complexity and external risks, directly at the initiation and planning stages.</em> <em>Special emphasis is placed on the development and implementation of a feature importance visualization mechanism, which transforms the forecasting model into a transparent analytical tool. This allows managers to not only obtain predicted values but also understand the underlying structure of the factors influencing them. It was established that the feature hierarchy, where technical complexity plays a leading role (0.793), enables the project manager to focus on the most critical planning nodes. 
Such an approach significantly enhances the transparency of decision-making and fosters increased stakeholder trust in the system's recommendations.</em> <em>The practical significance of the results lies in the possibility of implementing predictive management methods. The system identifies potential project bottlenecks before actual difficulties arise, providing the manager with a basis for timely reviews of team composition, budget limit adjustments, or schedule modifications. Thus, the proposed DSS serves as an effective tool for active management, providing decision support to prevent cost overruns and project schedule delays in dynamic environments.</em></p> Yelyzaveta HNATCHUK, Mariia LEBEDOVSKA Copyright (c) 2026 Єлизавета ГНАТЧУК, Марія ЛЕБЕДОВСЬКА https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/488 Tue, 30 Dec 2025 00:00:00 +0200 CROP YIELD MODEL BASED ON MAXIMUM VALUES OF CUMULATIVE VEGETATION INDICES https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/463 <p><em>This research develops a precision modeling approach for cereal crop yield estimation utilizing remote sensing data within a secure information architecture framework. A two-tier model is proposed wherein the first tier conducts vegetation index dynamics modeling (NDVI, MTCI) through an adaptive modified Monod model based on contemporary differential equation systems, while the second tier performs yield prediction via linear regression and machine learning methodologies to accommodate nonlinear interdependencies. 
An efficient parametric identification algorithm is developed for the models, accounting for their nonlinearity and employing the Levenberg-Marquardt gradient method for refined parameter optimization.</em></p> <p><em>A multi-tier information architecture incorporating blockchain technologies is proposed as a decentralized layer for ensuring data integrity and authenticity to mitigate cyber threats including data poisoning attacks and industrial espionage. An adaptive prediction algorithm based on observation window methodology is implemented, leveraging an ensemble of previously observed trajectories to maximize forecasting precision.</em></p> <p><em>Practical applicability is validated through numerical experiments on empirical vegetation index data from rice cultivation. The synthesized findings demonstrate the potential of the proposed methodology for addressing contemporary precision agriculture challenges, systematic food security monitoring, and strategic decision-making processes in the agricultural sector.</em></p> Roman PASICHNYK, Mykhailo MACHULYAK Copyright (c) 2025 Роман ПАСІЧНИК, Михайло МАЧУЛЯК https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/463 Tue, 30 Dec 2025 00:00:00 +0200 ENTROPY-CRYPTOGRAPHIC APPROACH FOR TRANSMISSION OF SATELLITE DATA IN TELECOMMUNICATION NETWORKS https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/476 <p><em>Amid rapid advances in satellite observations and increasing risks associated with high-resolution multispectral data downloads, stronger satellite image encryption techniques are becoming increasingly important. To address these challenges, the paper proposes a new cryptographic methodology based on entropy and nonlinear methods. This approach offers high resistance against brute-force attacks, generates a high-entropy key, and employs a dynamic nonlinear transformation within a multi-layered encryption scheme based on AES-256. 
This comprehensive approach is designed to safeguard satellite data in transit against modern cryptanalysis as it traverses communication networks. The approach begins with entropy-based initializations, iteratively expands the key space, and applies adaptive transformations to render the encrypted data highly unpredictable. Experiments with satellite images of various resolutions demonstrate that this approach is resistant to cryptanalysis. The evaluation included measuring entropy, analysing the correlation between neighbouring pixels, and testing resistance to statistical and frequency-based attacks. The encrypted images achieved entropy values close to the theoretical maximum and showed almost no correlation between adjacent pixels, demonstrating the strength and uniformity of the encryption process. Performance tests on systems with multiple threads and processors revealed clear links between execution time, data size, and the level of parallelization. Moderate parallelism provided the best speed improvements, and the method remained scalable for large datasets, making it suitable for high-throughput environments. The approach remains robust regardless of image size, content, or spectral characteristics. 
Its flexible structure and good performance make it a promising candidate for future telecom networks and secure satellite data distribution systems.</em></p> Vita KASHTAN, Volodymyr HNATUSHENKO Copyright (c) 2026 Віта КАШТАН, Володимир ГНАТУШЕНКО https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/476 Tue, 30 Dec 2025 00:00:00 +0200 METHOD FOR SYNTHESIS OF A SCALABLE ARCHITECTURE OF A DISTRIBUTED CS, RESISTANT TO SOCIAL ENGINEERING ATTACKS https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/482 <p><em>Social engineering continues to be one of the most dangerous classes of threats for modern distributed IT systems, where event processing, resource access, and protection mechanisms are performed on a large number of heterogeneous nodes. The growing scale of architectures, the emergence of multi-channel interaction scenarios and remote users, and a high level of dynamism create challenges for the synthesis of systems that are able to maintain resistance to social engineering attacks. The study proposes methods and tools for the synthesis of distributed systems focused on ensuring structural, behavioral, and functional resistance to such attacks.</em></p> <p><em>The basis of the approach is the use of a population multi-agent mean-field model, which allows a large number of nodes to be considered as a coordinated system of local detectors interacting through an aggregated state space. This makes it possible to describe the impact of attacks not on individual components, but on the entire distributed system as a whole, and to evaluate its response through integrated risk and resilience indicators. 
The study forms a generalized model of a distributed system, defines the roles of different types of nodes, protections, and interaction channels, and describes the methodology for architecture synthesis, which includes the classification of local actions, coordination mechanisms, and evaluation criteria.</em></p> <p><em>Special attention is paid to the integration of protective measures (deception components, multifactor authentication, and filtering and segmentation mechanisms) into the structure of a distributed system. Methods for optimizing the distribution of these measures at different levels of the architecture are proposed in accordance with the dynamics of the mean field and target requirements for stability. An iterative approach to architecture synthesis is developed, which combines the adaptation of local node strategies with the tuning of global system parameters.</em></p> <p><em>The results demonstrate that the use of the mean field concept makes it possible to ensure scalability of solutions and consistency of node behavior, and to increase the ability of a distributed system to counteract social engineering attacks under uncertainty and high variability of scenarios. 
The methodology can be used for the design, improvement and engineering synthesis of real distributed IT architectures operating in critical environments.</em></p> Oleksandr BOKHONKO, Olha ATAMANIUK Copyright (c) 2026 Олександр БОХОНЬКО, Ольга АТАМАНЮК https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/482 Tue, 30 Dec 2025 00:00:00 +0200 AI-BASED AUTOMATION OF DECISION LOGIC REPRESENTATION: BRIDGING GAP IN AUTOMATED DECISION MODELING VALIDATION https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/486 <p class="06AnnotationVKNUES"><em>The study aims to resolve the "modeling bottleneck" in Business Process Management by developing an automated method for ensuring the correctness of Decision Model and Notation (DMN) tables generated by Large Language Models (LLMs). The primary goal is to determine whether shifting the AI's role from a pure generator to a validator within a closed-loop system can overcome the structural limitations inherent in stochastic models.</em></p> <p class="06AnnotationVKNUES"><em>The research employs a comparative experimental design using a "Mutation Testing" approach. We analyze two distinct workflows: (1) Static Generation, where test cases are fixed, and (2) Dynamic Paired Generation, where the LLM regenerates both the decision logic (DMN XML) and the validation criteria (Test Cases JSON) simultaneously upon failure. The methodology integrates concepts from "Struc-Bench" for structural analysis and utilizes a deterministic DMN engine (Camunda) for execution-based verification.</em></p> <p class="06AnnotationVKNUES"><em>The experiments demonstrate that "out-of-the-box" LLM generation fails in approximately 4.5% of cases due to semantic drift and structural hallucinations when validated against static benchmarks. However, the proposed "Dynamic Paired Generation" workflow achieved a 100% convergence rate across 200 cycles. 
The system successfully identified and corrected both syntactic errors (XML schema violations) and logical errors (Hit Policy violations) without human intervention.</em></p> <p class="06AnnotationVKNUES"><em>The study introduces the concept of "Dynamic Paired Generation" for neuro-symbolic systems. Unlike traditional "Chain-of-Thought" prompting, this approach leverages the mutual consistency between two independent structural representations (Logic and Examples) to filter out hallucinations, proving that dynamic validation is superior to static prompting for structured data tasks.</em></p> <p class="06AnnotationVKNUES"><em>The proposed framework provides a blueprint for "Self-Healing" decision management systems. It allows non-technical business analysts to convert natural language policies into executable, error-free DMN models, significantly reducing the time and cost of regulatory compliance and process automation.</em></p> Vladyslav MALIARENKO Copyright (c) 2026 Владислав МАЛЯРЕНКО https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/486 Tue, 30 Dec 2025 00:00:00 +0200 ANALYTICAL WEB SERVICE FOR IDENTIFYING SUSPICIOUS HIGH-RISK DIGITAL ASSET TRANSACTIONS https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/483 <p><em>The rapid growth of digital asset transactions significantly complicates their analysis and monitoring. Although blockchain provides transparency of operations, the high level of pseudonymity among participants creates substantial challenges in identifying the nature of interactions and detecting potentially undesirable activity. This increases the demand for modern tools capable of automatically processing large volumes of data, grouping addresses into clusters, and evaluating their behavior based on aggregated transactional characteristics. 
A web service for digital asset transaction analytics is proposed that automatically forms address clusters and classifies them using machine learning models. The system provides an informative and interpretable data presentation, enabling users to assess the nature of activity associated with a given address. The methodology is based on heuristics for clustering blockchain addresses in networks utilizing the UTXO model, as well as on the calculation of structural and behavioral characteristics of the formed clusters. Random Forest, Extra Trees, and XGBoost models, trained on a labeled subset of the scientific dataset, were used for classification. The technical implementation of the web service is based on the Symfony framework for the server side, the React library for the client side, MySQL and PostgreSQL DBMS for data storage, the Python programming language for machine learning, as well as Docker and Nginx tools for deployment and stability. The results of the study demonstrate that machine learning models can effectively classify clusters of digital asset addresses according to their behavioral characteristics, and the integration of algorithms into the web service provides automatic generation of analytical reports. The scientific novelty lies in combining heuristics for address clustering with machine learning models, which makes it possible to evaluate the behavior of clusters and identify potential risk. 
The practical significance of the developed web service is defined by the possibility of its application for rapid preliminary checks of cryptocurrency addresses, detecting relationships between them, and assessing potential interaction risks, making it valuable for organizations and individual users alike.</em></p> Vitaly KOCHURA, Tamara LOKTIKOVA, Nadia KUSHNIR Copyright (c) 2026 Віталій КОЧУРА, Тамара ЛОКТІКОВА, Надія КУШНІР https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/483 Tue, 30 Dec 2025 00:00:00 +0200 METHOD FOR ENHANCING FMECA (XMECA) SAFETY ASSESSMENT PROCEDURES CONSIDERING THE CRITICALITY OF ASSUMPTIONS AND ANALYSIS ERRORS https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/494 <p><em>In existing studies that discuss the Failure Modes, Effects, and Criticality Analysis (FMECA, XMECA) method, two principal limiting factors are commonly identified: the significant influence of engineers’ and auditors’ experience on the resulting safety assessments, and the presence of restrictive assumptions embedded in assessment procedures and supporting tools. To address these limitations, this paper proposes a method for enhancing FMECA (XMECA) safety assessment procedures that explicitly accounts for the criticality of underlying assumptions and analysis errors. A case study of applying the proposed method demonstrates that it can serve as an effective instrument for researchers and developers working on reliability and safety assessment problems in critical systems. 
Further research will be devoted to</em> <em>the application of the method in different contexts and industrial sectors.</em></p> Ievgen BABESHKO, Vyacheslav KHARCHENKO, Kostiantyn LEONTIIEV Copyright (c) 2026 Євген БАБЕШКО, Вячеслав ХАРЧЕНКО, Костянтин ЛЕОНТІЄВ https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/494 Tue, 30 Dec 2025 00:00:00 +0200 LLM-DRIVEN QUERY GENERATION FOR GRAPH-BASED BUSINESS INTELLIGENCE: TOWARDS A COLLABORATIVE KNOWLEDGE RETRIEVAL TOOL https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/485 <div><em><span lang="EN">This paper explores the use of large language models (LLMs) to support collaborative business intelligence in the tourism domain through two key tasks: extracting travel-related tags from user queries and generating Cypher queries for accessing knowledge graphs. We focus on evaluating the performance of compact and efficient LLMs, aiming to balance accuracy with computational feasibility. To assess tag extraction, we evaluated Phi-3 Mini, LLaMA 3.2, and Gemma 3 using the DeepEval framework with G-Eval scoring. Phi-3 Mini showed the best balance between accuracy and efficiency, while Gemma 3 achieved the highest scores at the cost of increased resource usage. For Cypher query generation, we tested more powerful models: Mistral Small 3.1, Phi-4, Gemma 3, and ChatGPT-4o. ChatGPT-4o achieved the highest correctness, while Mistral Small demonstrated the best trade-off among smaller models. Our results suggest that lightweight LLMs are suitable for basic natural language processing tasks, but structured query generation remains challenging and requires stronger models. Further research is needed to improve the reliability of generated queries and to develop robust validation mechanisms. 
This study introduces a comparative evaluation of lightweight and standard LLMs specifically applied to collaborative business intelligence in the tourism domain. It highlights the feasibility of using compact LLMs for natural language processing tasks while demonstrating the challenges of structured query generation, which requires more powerful models.</span></em></div> Oleksandr SUTIAHIN, Olga CHEREDNICHENKO Copyright (c) 2026 Олександр СУТЯГІН, Ольга ЧЕРЕДНІЧЕНКО https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/485 Tue, 30 Dec 2025 00:00:00 +0200 USE OF GRADATION CORRECTION METHOD FOR IMPROVING THE VISUALIZATION OF COMPUTED TOMOGRAPHY RESULTS https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/473 <p><em>The aim of this work is to enhance the quality of visualization of computed tomography (CT) results by improving digital image processing methods, particularly through the use of the gradation correction (GC). The research methodology involves the application of algorithms for the automatic determination of the optimal pixel brightness range in CT images, which reduces the influence of extreme brightness values and increases the contrast of diagnostically significant structures. The proposed gradation correction is applied not within the standard 0–255 range but based on the statistical distribution of brightness, with the exclusion of non-informative tail regions of the histogram. The research results indicate that the proposed approach improves the visual clarity of organ boundaries and pathological formations without loss of diagnostic information and can be integrated into automated medical image analysis systems. The scientific novelty of the work lies in the enhancement of the gradation correction method for CT image processing through the automation of brightness range selection. 
The practical significance of the study is in improving the efficiency of visual analysis and computer-aided diagnostics at the stages of preliminary medical image processing, contributing to higher accuracy in pathology detection and reduced evaluation time.</em></p> Eugen Vakulik, Kirill Smelyakov Copyright (c) 2026 Євген ВАКУЛІК, Кирило СМЕЛЯКОВ https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/473 Tue, 30 Dec 2025 00:00:00 +0200 IMPROVEMENT OF THE AUTOMATED NLP SYSTEM AS A FACTOR IN IMPROVING THE QUALITY OF MARKETING STRATEGY FORMATION https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/469 <p><em>Along with the rapid growth of the Internet of Things and edge computing, collaborative training of machine learning models on resource-constrained edge devices, while ensuring user data privacy, has become a key and challenging research problem. The presented paper aims to fill this research gap by designing, implementing, and evaluating a synergistic and optimized federated learning infrastructure with differential privacy called PriFed-IoT, specifically designed for Internet of Things edge computing scenarios. The novelty of this work lies in the creation of a system of several modules working together, rather than a simple combination of techniques. The main idea is to use adaptive differential privacy to create an environment with a higher signal-to-noise ratio for the clustering algorithm in the late stages of training, allowing for more accurate separation of clients. In order to test the efficiency of the PriFed-IoT infrastructure, a series of complex simulation experiments were developed and conducted on a standard CIFAR-10 dataset, simulating different degrees of data heterogeneity. 
The study offers a comprehensive solution; the experimental results convincingly demonstrate the advantages of the PriFed-IoT infrastructure in balancing privacy protection, model utility, and system efficiency, providing a valuable theoretical framework and technical implementation for building secure, efficient, and reliable intelligent applications at the edge of the Internet of Things.</em></p> Yuriy SKORIN Copyright (c) 2026 Юрій СКОРІН https://creativecommons.org/licenses/by/4.0 https://csitjournal.khmnu.edu.ua/index.php/csit/article/view/469 Tue, 30 Dec 2025 00:00:00 +0200