GRID SEARCH-DRIVEN HYPERPARAMETER OPTIMIZATION FOR RANDOM FOREST MODEL IN TEXT ANALYSIS
Random Forest, a widely used ensemble learning algorithm, exhibits strong classification performance; however, its accuracy and computational efficiency depend heavily on appropriate hyperparameter tuning. To address this, this study employs a systematic grid search to identify the optimal combination of key Random Forest parameters. Experimental results show that the optimized Random Forest model achieved an accuracy of 98.40%, outperforming the baseline model across all key performance metrics, including precision, recall, and F1-score, with improvements of up to 0.31%. Notably, the optimized configuration also reduced training time by approximately 83.72%, underscoring the computational efficiency gains from systematic tuning. These findings indicate that structured grid search optimization not only enhances predictive performance but also reduces computational cost, making it a robust and interpretable approach for machine learning model refinement.
Random Forest, Grid Search, Hyperparameter, Classification, Optimization, Machine Learning
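The tuning procedure described above can be sketched with scikit-learn's `GridSearchCV` over a TF-IDF + Random Forest pipeline. This is a minimal illustration only: the toy corpus, the parameter grid values, and the cross-validation setting are assumptions for demonstration, not the configuration or dataset used in the study.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

# Hypothetical toy corpus standing in for the study's text dataset.
texts = [
    "win a free prize now", "limited offer claim your reward",
    "free cash prize waiting", "claim this exclusive offer today",
    "meeting agenda for monday", "please review the attached report",
    "project status update for the team", "notes from the weekly meeting",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]  # 1 = spam-like, 0 = work-like

# Vectorize text, then classify with a Random Forest.
pipeline = Pipeline([
    ("tfidf", TfidfVectorizer()),
    ("rf", RandomForestClassifier(random_state=42)),
])

# Example grid over key Random Forest hyperparameters (values are illustrative).
param_grid = {
    "rf__n_estimators": [50, 100],
    "rf__max_depth": [None, 10],
    "rf__min_samples_split": [2, 5],
}

# Exhaustive grid search with cross-validated accuracy as the criterion.
search = GridSearchCV(pipeline, param_grid, cv=2, scoring="accuracy")
search.fit(texts, labels)

print("Best parameters:", search.best_params_)
print("Best CV accuracy:", search.best_score_)
```

In practice, the grid, scoring metric, and number of cross-validation folds would be chosen to match the dataset size and the study's evaluation protocol.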