Gas turbine components operate at temperatures well above the melting point of their materials and are therefore subject to extremely high thermal stresses. To keep them intact and performing, various cooling techniques are employed; one of these is film cooling. Implementing film cooling in vane cascades comes with a potential aerodynamic loss penalty, so its impact on vane performance must be properly assessed. The conventional CFD approach of modeling each hole and cooling tube individually is computationally very expensive. In the present work, a new, more computationally efficient CFD approach to modeling film cooling was assessed on a vane cascade operating in the transonic regime (M = 0.89). The film cooling holes were represented by an orifice boundary condition at the vane surface, eliminating the need to mesh the internal coolant plenum and cooling tubes; this reduced the grid size to roughly a third of the conventional mesh (a 180% difference) and the associated computation time to roughly a quarter (a 300% difference). Simulations of the uncooled vane and of several film-cooled configurations at different blowing ratios (BR) were performed and compared with experimental measurements. Good agreement was obtained for the exit flow angles, vorticity, and aerodynamic loss in all cases (uncooled and cooled). The pitch-averaged exit flow angle outside the endwall regions remains unchanged for all cooling configurations and blowing ratios. The aerodynamic loss was found to be more sensitive to increasing the blowing ratio on the suction side than on the pressure side. The proposed coolant injection modeling approach is shown to yield reliable results, within the measurement uncertainty in most cases. Given its lower computational cost compared with the conventional film cooling modeling approach, the new approach is recommended for further analysis of aerodynamic and thermal vane cascade flows.
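The blowing ratio referenced above is the standard coolant-to-mainstream mass-flux ratio, BR = (ρ_c u_c)/(ρ_∞ u_∞). As a minimal sketch (not the study's actual solver setup, and with purely illustrative numbers), the coolant mass flux prescribed at an orifice boundary condition could be derived from a target BR as follows:

```python
def coolant_mass_flux(br, rho_inf, u_inf):
    """Coolant mass flux (rho_c * u_c) for a target blowing ratio.

    Uses the standard definition BR = (rho_c * u_c) / (rho_inf * u_inf),
    so the orifice boundary condition's mass flux is BR times the
    mainstream mass flux.
    """
    return br * rho_inf * u_inf

# Hypothetical mainstream conditions (illustrative only, not from the paper)
rho_inf = 1.2   # mainstream density, kg/m^3
u_inf = 290.0   # mainstream velocity, m/s
flux = coolant_mass_flux(1.5, rho_inf, u_inf)  # kg/(m^2 s) at the orifice
```

In practice this mass flux would be imposed directly at the hole footprint on the vane surface, which is what allows the internal plenum and tube meshes to be omitted.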