We examine the 2002 Yakutsk wildfire event and simulate the impacts of smoke aerosols on the local radiative energy budget using the WRF-Chem-SMOKE model. Comparing satellite retrievals (the Surface Radiation Budget (SRB) dataset) with the model simulations, we find generally good agreement, except in regions where the model predicts too few clouds or where SRB misclassifies strong smoke plumes as clouds. We also find that smoke-induced changes in upward shortwave fluxes at the top of the atmosphere (TOA) vary under different burning and meteorological conditions. In the first period of the fire season (9–12 August), smoke particles cause a warming effect of around 3 W/m2, mainly by acting as ice nuclei, which deplete the cloud water in the frontal system. At the beginning of the second period of the fire season (19–20 August), large amounts of pre-existing smoke particles cause a strong cooling effect of −8 W/m2. This cooling is partly offset by a warming effect: a relatively small increase in cloud condensation nuclei promotes rain formation and depletes the cloud water. After the cloud decks become well mixed with the smoke plumes (21–22 August), the first indirect effect and the direct effect of smoke together lead to a cooling of −10 W/m2. These results highlight the importance of mesoscale modeling efforts in estimating smoke-induced changes in the radiative energy budget over high latitudes.