The growing dominance of intra-die process variations creates a need for accurate and fast statistical timing analysis. Most recently proposed approaches assume a Single Input Switching (SIS) model. Our experiments show that SIS underestimates the mean delay of a stage by up to 20% and overestimates the standard deviation by up to 26%. We also show that Multiple Input Switching (MIS) has a greater impact on statistical timing analysis than on regular static timing analysis. Hence, we propose a modeling technique for gate delay variability that accounts for MIS. Our model can be efficiently incorporated into most statistical timing analysis frameworks. On average over all test cases, our approach underestimates the mean delay of a stage by only 0.01% and overestimates the standard deviation by only 2%, thus improving robustness to process variations. Our modeling technique is independent of the underlying deterministic MIS model, and we show that its sensitivity to variations in that model is small. Ca...