RagRug: A Toolkit for Situated Analytics

Published: 01 July 2023

Abstract

We present RagRug, an open-source toolkit for situated analytics. RagRug goes beyond previous immersive analytics toolkits by focusing on the specific requirements that emerge when using augmented reality (AR) rather than virtual reality. RagRug combines state-of-the-art visual encoding capabilities with a comprehensive physical-virtual model, which lets application developers systematically describe the physical objects in the real world and their role in AR. We connect AR visualizations with data streams from the Internet of Things using distributed dataflow. To this end, we use reactive programming patterns so that visualizations become context-aware, i.e., they adapt to events coming in from the environment. The resulting authoring system is low-code: it emphasizes describing the physical and the virtual world and the dataflow between the elements contained therein. We describe the technical design and implementation of RagRug and report on five example applications illustrating the toolkit's abilities.
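
To make the reactive-dataflow idea concrete, the sketch below illustrates the pattern the abstract describes: a sensor stream publishes events, and a subscribed visualization reacts by adapting its visual encoding. This is a minimal single-process sketch, not RagRug's actual API; the class names (SensorStream, BarVisualization), the topic string, and the threshold are hypothetical, and RagRug itself realizes the pattern as distributed dataflow between IoT data sources and an AR client.

```python
# Minimal observer-pattern sketch of reactive, context-aware visualization.
# All class, topic, and parameter names are hypothetical illustrations;
# RagRug's real dataflow runs between an IoT broker and an AR client,
# not inside a single process.

from dataclasses import dataclass, field
from typing import Callable


@dataclass
class SensorStream:
    """An observable stream of sensor readings (stand-in for an IoT topic)."""
    topic: str
    subscribers: list = field(default_factory=list)

    def subscribe(self, callback: Callable[[float], None]) -> None:
        self.subscribers.append(callback)

    def publish(self, value: float) -> None:
        # Push-based dataflow: every new reading is forwarded to subscribers.
        for callback in self.subscribers:
            callback(value)


class BarVisualization:
    """A visualization that adapts its encoding to incoming events."""

    def __init__(self, threshold: float):
        self.threshold = threshold

    def on_data(self, value: float) -> None:
        # Context-aware adaptation: switch the color encoding when the
        # observed value crosses a threshold.
        color = "red" if value > self.threshold else "steelblue"
        print(f"re-render bar: height={value:.1f}, color={color}")


stream = SensorStream(topic="factory/machine1/temperature")
vis = BarVisualization(threshold=80.0)
stream.subscribe(vis.on_data)

for reading in (72.5, 79.9, 84.2):  # simulated sensor readings
    stream.publish(reading)
```

Running the sketch prints one re-render per published reading, with the color encoding switching once the simulated temperature exceeds the threshold. In RagRug, the analogous wiring is authored low-code: the developer declares the physical objects, data streams, and visual encodings, and the toolkit maintains the event flow between them.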

Published In

IEEE Transactions on Visualization and Computer Graphics, Volume 29, Issue 7 (July 2023), 296 pages

Publisher

IEEE Educational Activities Department, United States

Qualifiers

  • Research-article


Cited By

  • (2024) Comparison of Spatial Visualization Techniques for Radiation in Augmented Reality. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, pp. 1-15. DOI: 10.1145/3613904.3642646. Online publication date: 11-May-2024.
  • (2024) CompositingVis: Exploring Interactions for Creating Composite Visualizations in Immersive Environments. IEEE Transactions on Visualization and Computer Graphics, vol. 31, no. 1, pp. 591-601. DOI: 10.1109/TVCG.2024.3456210. Online publication date: 9-Sep-2024.
  • (2024) VoxAR: Adaptive Visualization of Volume Rendered Objects in Optical See-Through Augmented Reality. IEEE Transactions on Visualization and Computer Graphics, vol. 30, no. 10, pp. 6801-6812. DOI: 10.1109/TVCG.2023.3340770. Online publication date: 1-Oct-2024.
  • (2024) ARGUS: Visualization of AI-Assisted Task Guidance in AR. IEEE Transactions on Visualization and Computer Graphics, vol. 30, no. 1, pp. 1313-1323. DOI: 10.1109/TVCG.2023.3327396. Online publication date: 1-Jan-2024.
  • (2024) Wizualization: A "Hard Magic" Visualization System for Immersive and Ubiquitous Analytics. IEEE Transactions on Visualization and Computer Graphics, vol. 30, no. 1, pp. 507-517. DOI: 10.1109/TVCG.2023.3326580. Online publication date: 1-Jan-2024.
  • (2024) mint: Integrating scientific visualizations into virtual reality. Journal of Visualization, vol. 27, no. 6, pp. 1143-1169. DOI: 10.1007/s12650-024-01011-y. Online publication date: 1-Dec-2024.
  • (2023) Exploring Augmented Reality for Situated Analytics with Many Movable Physical Referents. Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology, pp. 1-12. DOI: 10.1145/3611659.3615700. Online publication date: 9-Oct-2023.
  • (2023) MR Object Identification and Interaction. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 7, no. 3, pp. 1-26. DOI: 10.1145/3610879. Online publication date: 27-Sep-2023.
  • (2023) Usability Evaluation of an Augmented Reality System for Collaborative Fabrication between Multiple Humans and Industrial Robots. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, pp. 1-10. DOI: 10.1145/3607822.3614528. Online publication date: 13-Oct-2023.
  • (2023) Scene Responsiveness for Visuotactile Illusions in Mixed Reality. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, pp. 1-15. DOI: 10.1145/3586183.3606825. Online publication date: 29-Oct-2023.
