Human action and activity recognition from videos has attracted an increasing number of researchers in recent years. However, most works target multimedia retrieval and surveillance applications, and rarely humanoid household robots, even though robotic perception of human activities would enable more natural human-robot interaction (HRI). To encourage future studies in this domain, we present a novel data set specifically designed for HRI scenarios. The Robo-kitchen data set consists of 14 typical kitchen activities, each performed by 17 subjects and recorded in two different stereo-camera setups. To establish a baseline for future work, we extend a state-of-the-art action recognition method to the activity classification problem and evaluate it on the Robo-kitchen data set, showing promising results.