Polish Information Processing Society

Annals of Computer Science and Information Systems, Volume 15

Proceedings of the 2018 Federated Conference on Computer Science and Information Systems

Towards a Framework for Semi-Automated Annotation of Human Order Picking Activities Using Motion Capturing


DOI: http://dx.doi.org/10.15439/2018F188

Citation: Proceedings of the 2018 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 15, pages 817–821


Abstract. Data creation for Human Activity Recognition (HAR) requires immense human effort and contextual knowledge for manual annotation. This paper proposes a framework for the semi-automated annotation of sequential data in the order picking process using a motion capturing system. Additionally, it introduces suitable annotation labels by defining process steps, human activities, and simple human movements in order picking scenarios. An attribute representation based on simple human movements addresses the challenges posed by the variety of activities in warehousing.
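The attribute representation mentioned in the abstract can be sketched as follows: each higher-level activity is described by a binary vector over a shared vocabulary of simple human movements, so that new activities can be expressed by recombining known attributes. All movement and activity names below are illustrative assumptions, not labels taken from the paper:

```python
# Hypothetical vocabulary of simple human movements (illustrative only).
SIMPLE_MOVEMENTS = ["walk", "bend", "reach", "grasp", "carry", "stand"]

# Illustrative mapping from order-picking activities to their movement attributes.
ACTIVITY_ATTRIBUTES = {
    "move_to_shelf":  {"walk", "stand"},
    "pick_item":      {"bend", "reach", "grasp"},
    "transport_item": {"walk", "carry"},
}

def to_attribute_vector(activity):
    """Encode an activity as a binary vector over the simple-movement vocabulary."""
    attrs = ACTIVITY_ATTRIBUTES[activity]
    return [1 if movement in attrs else 0 for movement in SIMPLE_MOVEMENTS]

print(to_attribute_vector("pick_item"))  # -> [0, 1, 1, 1, 0, 0]
```

A classifier trained to predict such attribute vectors from motion-capture features can, in principle, recognize activity compositions it never saw as whole labels, which is the appeal of attribute representations for the versatile activities found in warehousing.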

