DOI: 10.1145/2556288.2557090

MixFab: a mixed-reality environment for personal fabrication

Published: 26 April 2014

Abstract

Personal fabrication machines, such as 3D printers and laser cutters, are becoming increasingly ubiquitous. However, designing objects for fabrication still requires 3D modeling skills, thereby rendering such technologies inaccessible to a wide user group. In this paper, we introduce MixFab, a mixed-reality environment for personal fabrication that lowers the barrier for users to engage in personal fabrication. Users design objects in an immersive augmented reality environment, interact with virtual objects in a direct gestural manner and can introduce existing physical objects effortlessly into their designs. We describe the design and implementation of MixFab, a user-defined gesture study that informed this design, show artifacts designed with the system and describe a user study evaluating the system's prototype.

Supplementary Material

suppl.mov (pn0736-file3.mp4)
Supplemental video




Published In

CHI '14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2014, 4206 pages
ISBN: 9781450324731
DOI: 10.1145/2556288

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. 3d modeling
    2. 3d printing
    3. direct manipulation
    4. mixed-reality
    5. personal fabrication

    Qualifiers

    • Research-article

    Conference

CHI '14: CHI Conference on Human Factors in Computing Systems
April 26 - May 1, 2014
Toronto, Ontario, Canada

    Acceptance Rates

    CHI '14 Paper Acceptance Rate 465 of 2,043 submissions, 23%;
    Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


    Article Metrics

    • Downloads (last 12 months): 171
    • Downloads (last 6 weeks): 13
    Reflects downloads up to 04 Jan 2025


    Cited By

    • (2024) Blended Physical-Digital Kinesthetic Feedback for Mixed Reality-Based Conceptual Design-In-Context. Proceedings of the 50th Graphics Interface Conference, 1-16. DOI: 10.1145/3670947.3670967. Online publication date: 3-Jun-2024.
    • (2024) Don't Mesh Around: Streamlining Manual-Digital Fabrication Workflows with Domain-Specific 3D Scanning. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-16. DOI: 10.1145/3654777.3676385. Online publication date: 13-Oct-2024.
    • (2024) DisplayFab: The State of the Art and a Roadmap in the Personal Fabrication of Free-Form Displays Using Active Materials and Additive Manufacturing. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-24. DOI: 10.1145/3613904.3642708. Online publication date: 11-May-2024.
    • (2024) GlucoMaker: Enabling Collaborative Customization of Glucose Monitors. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-21. DOI: 10.1145/3613904.3642435. Online publication date: 11-May-2024.
    • (2024) DungeonMaker: Embedding Tangible Creation and Destruction in Hybrid Board Games through Personal Fabrication Technology. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3613904.3642243. Online publication date: 11-May-2024.
    • (2024) pARam: Leveraging Parametric Design in Extended Reality to Support the Personalization of Artifacts for Personal Fabrication. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-22. DOI: 10.1145/3613904.3642083. Online publication date: 11-May-2024.
    • (2024) Towards Safer Mixed Reality: Identifying, Evaluating, and Mitigating Security and Privacy Threats. 2024 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 662-663. DOI: 10.1109/ISMAR-Adjunct64951.2024.00200. Online publication date: 21-Oct-2024.
    • (2024) Comparison of deviceless methods for distant object manipulation in mixed reality. Computers & Graphics 122, 103959. DOI: 10.1016/j.cag.2024.103959. Online publication date: Aug-2024.
    • (2023) Constraint-based bare-hand immersive 3D modelling. i-com 22(2), 125-141. DOI: 10.1515/icom-2023-0013. Online publication date: 31-May-2023.
    • (2023) A cyber-physical system to design 3D models using mixed reality technologies and deep learning for additive manufacturing. PLOS ONE 18(7), e0289207. DOI: 10.1371/journal.pone.0289207. Online publication date: 27-Jul-2023.
