Task recognition from joint tracking data in an operational manufacturing cell

Journal of Intelligent Manufacturing

Abstract

This paper investigates the feasibility of using inexpensive, general-purpose automated methods for recognizing worker activity in manufacturing processes. A novel aspect of this study is that it is based on live data collected from an operational manufacturing cell without any guided or scripted work. Activity in a single-worker cell was recorded using the Microsoft Kinect, a commodity-priced sensor that records depth data and provides built-in detection of human skeletal poses, including the positions of all major joints. Joint position data for two workers on different shifts were used as input to a collection of learning algorithms with the goal of classifying each worker's activity at each moment in time. Results show that unsupervised and semi-supervised algorithms, such as unsupervised hidden Markov models, suffer little loss of accuracy compared with supervised methods trained on ground-truth data. This conclusion is important because it implies that automated activity recognition can be accomplished without ground-truth labels, which can be obtained only through time-consuming manual review of video. The results of this study suggest that intelligent manufacturing can now include detailed process-control measures of human workers, with systems affordable enough to be installed permanently for continuous data collection.
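The core pipeline described in the abstract can be sketched in a few lines: per-frame joint positions are treated as a multivariate time series, a hidden Markov model is trained on them without any labels, and Viterbi decoding assigns each frame to a hidden state that can later be interpreted as an activity. The sketch below is a minimal illustration under stated assumptions, not the authors' implementation; hmmlearn's GaussianHMM, the flattened x-y-z joint feature layout, and the six-state choice are all stand-ins.

```python
"""Illustrative sketch only (not the authors' code): unsupervised activity
decoding from Kinect-style joint tracks.

Assumptions: skeletal joint coordinates for each frame are stacked into an
array of shape (n_frames, n_joints * 3); hmmlearn's GaussianHMM stands in for
the unsupervised hidden Markov model mentioned in the abstract; the number of
hidden states (one per presumed activity) is a guess.
"""
import numpy as np
from hmmlearn.hmm import GaussianHMM


def decode_activities(joint_features: np.ndarray, n_states: int = 6,
                      seed: int = 0) -> np.ndarray:
    """Fit an HMM with no ground-truth labels; return a per-frame state index."""
    # Standardize features so no single joint coordinate dominates the Gaussians.
    mean = joint_features.mean(axis=0)
    std = joint_features.std(axis=0) + 1e-8
    X = (joint_features - mean) / std

    # Diagonal covariances keep EM training cheap for high-dimensional skeletons.
    model = GaussianHMM(n_components=n_states, covariance_type="diag",
                        n_iter=100, random_state=seed)
    model.fit(X)             # unsupervised: expectation-maximization, no labels
    return model.predict(X)  # Viterbi decoding: most likely hidden state per frame


if __name__ == "__main__":
    # Synthetic stand-in for real Kinect output: 1,000 frames of 20 joints (x, y, z).
    rng = np.random.default_rng(0)
    fake_joints = rng.normal(size=(1000, 20 * 3))
    print(decode_activities(fake_joints)[:25])
```

Interpreting the decoded states as named activities would still require either a small amount of labeled data or manual inspection of a few segments per state, which is consistent with the paper's point that full ground-truth labeling of every frame is unnecessary.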


Notes

  1. http://people.virginia.edu/~djr7m/incom2015/.


Acknowledgments

We would like to thank Aerojet Rocketdyne for allowing us to collect data at one of their facilities. We also thank reviewers of an earlier manuscript for their valuable feedback. This research was supported in part by both SAIC and the Commonwealth Center for Advanced Manufacturing.

Author information

Corresponding author: Don J. Rude.


Cite this article

Rude, D.J., Adams, S. & Beling, P.A. Task recognition from joint tracking data in an operational manufacturing cell. J Intell Manuf 29, 1203–1217 (2018). https://doi.org/10.1007/s10845-015-1168-8
