In this paper we investigate a new architecture for recognizing human group actions in meetings. These group actions provide a basis for effective browsing and querying of a meeting archive. For this task we propose an architecture inspired by neural field theory. Our approach is distinctive in that, unlike other methods, we present all features to the classifier in parallel. The experiments show that our system achieves results comparable to existing sequential techniques.
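To make the notion of parallel feature presentation concrete, the following is a minimal sketch, not the authors' implementation: it assumes hypothetical per-modality feature streams (`audio_feats`, `video_feats`) and a placeholder linear classifier (`classify_frame`), and only illustrates that all streams are combined into one joint input per frame rather than processed by separate sequential models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-frame features from two modalities
# (e.g. speech activity, participant motion), 100 frames each.
audio_feats = rng.normal(size=(100, 4))
video_feats = rng.normal(size=(100, 6))

# Parallel presentation: concatenate all modalities into one feature
# vector per frame, so the classifier sees every stream at once.
joint_feats = np.concatenate([audio_feats, video_feats], axis=1)  # (100, 10)

# Placeholder linear classifier over a small set of group-action labels;
# in practice the weights would be learned from annotated meetings.
labels = ["discussion", "monologue", "presentation"]
W = rng.normal(size=(joint_feats.shape[1], len(labels)))

def classify_frame(x):
    """Return the most likely group-action label for one joint feature vector."""
    return labels[int(np.argmax(x @ W))]

print(classify_frame(joint_feats[0]))
```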