Polish Information Processing Society

Annals of Computer Science and Information Systems, Volume 13

Communication Papers of the 2017 Federated Conference on Computer Science and Information Systems

Task Execution Support in Research Activity using RAC System


DOI: http://dx.doi.org/10.15439/2017F244

Citation: Communication Papers of the 2017 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 13, pages 285–291


Abstract. Research activities are carried out daily in research laboratories, but they involve steady, methodical work that does not produce immediate, visible results. For this reason, a mechanism that maintains motivation when research is not going well, or that helps students get on track when they have just been assigned to a laboratory, could be useful. Students who have just begun their research may not yet understand how to proceed. We previously developed a research activity concierge (RAC) system, a platform encompassing general research activities, and applied gamification to it to keep user motivation high. However, even with the RAC, students inexperienced in research have difficulty handling challenges and executing tasks. In this work, we focused on discussions in seminars and introduced a mechanism that supports task execution in students' research activities by implementing automatic extraction of task statements in the RAC.
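The abstract does not specify how task statements are detected, so the following is only an illustrative sketch: a simple rule-based filter that flags task-like sentences in a seminar-discussion transcript by matching obligation or deadline phrases. All patterns and names here are hypothetical, not the paper's actual method.

```python
import re

# Hypothetical cue phrases for task-like statements (assumed for
# illustration; the paper's actual extraction method is not given here).
TASK_PATTERNS = [
    r"\bneed to\b",
    r"\bshould\b",
    r"\bhave to\b",
    r"\bby next (week|meeting)\b",
]

def extract_task_statements(sentences):
    """Return the sentences that match at least one task cue phrase."""
    return [
        s for s in sentences
        if any(re.search(p, s, re.IGNORECASE) for p in TASK_PATTERNS)
    ]

transcript = [
    "The evaluation results look promising.",
    "You should rerun the experiment with more participants.",
    "We need to fix the annotation tool by next meeting.",
]
print(extract_task_statements(transcript))
```

In practice such a system would more likely use a trained classifier over transcribed and structured meeting content, but a pattern-based baseline like this conveys the idea of surfacing actionable statements from a discussion for follow-up in the RAC.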
