Because a bank of 2D Gabor filters can isolate texture at specific frequencies and orientations, Gabor features are well suited to modeling the visual appearance of the human hand. In this paper, we propose an optimized Gabor-feature-based framework for real-time hand gesture recognition explicitly targeted at depth data. The framework proceeds as follows. First, the hand silhouette is extracted from the acquired depth data; then an extensive set of new local Gabor features is computed to characterize the segmented hands by their appearance. Finally, the extracted features are fed into a discriminative Latent-Dynamic Conditional Random Field (LDCRF) model for gesture recognition. Evaluations on a standard gesture recognition dataset demonstrate that the proposed approach outperforms a number of popular state-of-the-art approaches, achieving an average recognition rate of 90.6% without sacrificing computational efficiency.
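To make the feature-extraction step concrete, the following is a minimal sketch of a Gabor filter bank applied to a binary hand silhouette. The paper's exact filter parameters and local feature layout are not specified here, so the kernel sizes, wavelengths, orientation count, and the simple mean-magnitude pooling below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, lambd, gamma=0.5, psi=0.0):
    # Build a real-valued 2D Gabor kernel: a Gaussian envelope
    # modulating a cosine carrier at orientation theta and wavelength lambd.
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t)**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * x_t / lambd + psi)
    return envelope * carrier

def gabor_features(silhouette, n_orientations=4, wavelengths=(4.0, 8.0)):
    # Convolve the silhouette with each filter in the bank (via FFT,
    # i.e. circular convolution) and pool the mean response magnitude
    # per filter into one feature vector. Pooling choice is an assumption.
    feats = []
    img_f = np.fft.fft2(silhouette)
    for lambd in wavelengths:
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            kern = gabor_kernel(15, sigma=0.56 * lambd,
                                theta=theta, lambd=lambd)
            kern_f = np.fft.fft2(kern, s=silhouette.shape)
            resp = np.real(np.fft.ifft2(img_f * kern_f))
            feats.append(np.mean(np.abs(resp)))
    return np.array(feats)
```

With the defaults above (4 orientations, 2 wavelengths), each silhouette yields an 8-dimensional descriptor; a per-cell spatial pooling over the hand region, as suggested by the paper's "local" features, would multiply this by the number of cells.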