Location via proxy:   [ UP ]  
[Report a bug]   [Manage cookies]                
Download as pdf or txt
Download as pdf or txt
You are on page 1of 39

US010168827B2

(12) United States Patent ( 10) Patent No.: US 10 , 168 ,827 B2


Hinckley et al. (45) Date of Patent: * Jan . 1, 2019
(54 ) SENSOR CORRELATION FOR PEN AND (58 ) Field of Classification Search
TOUCH -SENSITIVE COMPUTING DEVICE CPC .. GO6F 3 /0416 ; G06F 3 /0383 ; G06F 3 /04883;
INTERACTION G06F 3 /03545 ; GO6F 3/ 0482
(71 ) Applicant: Microsoft Technology Licensing, LLC , ( Continued )
Redmond , WA (US) ( 56 ) References Cited
(72 ) Inventors: Ken Hinckley , Redmond , WA (US ); U . S . PATENT DOCUMENTS
Hrvoje Benko , Seattle, WA (US );
Michel Pahud, Kirkland, WA (US ) ; 5 ,149 , 919 A 9 / 1992 Greenies et al.
Andrew D . Wilson , Seattle , WA (US); 5 , 198 ,623 A 3 / 1993 Landmeier
Pourang Polad Irani, Winnipeg (CA ) ; (Continued )
Francois Guimbretiere , Ithaca , NY
(US ) FOREIGN PATENT DOCUMENTS
(73 ) Assignee : MICROSOFT TECHNOLOGY KR 10 - 2012 -0005417 AL 1/ 2012
LICENSING , LLC , Redmond , WA WO 2009084809 A1 7 /2012
(US ) WO 2013054155 Al 4 / 2013

( * ) Notice: Subject to any disclaimer, the term of this OTHER PUBLICATIONS


patent is extended or adjusted under 35
U .S .C . 154 (b ) by 0 days. Aliakseyeu , D ., A . Lucero , S . Subramanian , Interacting with piles of
This patent is subject to a terminal dis artifacts on digital tables , Digital Creativity, Jul. 2007, pp . 161 -174 ,
vol. 18 , No. 3 .
claimer .
(Continued )
(21 ) Appl . No.: 15/640 ,507 Primary Examiner — Michael Pervan
(22) Filed: Jul. 1, 2017 (74 ) Attorney, Agent, or Firm — Alleman Hall Creasman
& Tuttle LLP
(65 ) Prior Publication Data
US 2017 /0300170 A1 Oct. 19, 2017 (57 ) ABSTRACT
Related U .S. Application Data Pen and computing device sensor correlation technique
embodiments correlate sensor signals received from various
(63 ) Continuation of application No. 14 /303 ,203, filed on grips on a touch -sensitive pen and touches to a touch
Jun . 12, 2014, now Pat. No. 9,727 , 161. sensitive computing device in order to determine the context
of such grips and touches and to issue context- appropriate
(51) Int. Ci. commands to the touch sensitive pen or the touch -sensitive
G06F 3 /041 ( 2006 .01) computing device . A combination of concurrent sensor
G06F 3 / 0487 ( 2013 .01) inputs received from both a touch - sensitive pen and a
(Continued ) touch - sensitive computing device are correlated . How the
(52 ) CPC
U .S. CI........... . G06F 3 /0416 ( 2013 .01); G06F 3 /0383
touch - sensitive pen and the touch - sensitive computing
device are touched or gripped are used to determine the
(2013 .01) ; G06F 3/03545 (2013 .01) ; context of their use and the user' s intent. A context-appro
(Continued ) (Continued )
Sensor Module 508
Sensor Monitormadings of one or more
Pen sensors coupled to sensor pen
600 ~
502
Communications Corrmunication Module 510
Link
Send sensor Negding to
506 . . computing device
Touch - Sensitive
Computing Device

Sensor Per Input Module 512 Cotiputing Device Touch Input Module 514
Receive inputoffrom one or more sensors Receive Input from one ormore touch senslove
sensor pan . Sustaces of compuong device

BpDatabase Grip and Touch Datarminton Modulo 515


512 Determines the gripitouch patterns on the touch sensitive computing
Known Grips device and the touch patlemg on the sensor pen . Conslates these
with other pensar Input daslred .

Context Determination tooule 520 Meta


Correlates the gre patterns an the sergor pen with the grip and touch patteme on displays or other Dato
touch senordve surfaces of compuong device, and possibly other sensor data , to determine context Labeler
of Use and user intent. Determine the preferred hand /non - prafered and of the user, 524

Command inidadon Module 522


Generates appropriate commands based the determined context and user Intent
. .
Advanced . Magnifier
Paimii Drafting Pan / Thurtib Thuthi Pen Canvas Drating User
Rejection als
Loupe Zoom : Contact Menu Tod 's Tools Tools Defined or
Tools Tool Rejection 536 639 540 ) Other
520 5327 544 634 542 )
530
US 10 ,Page
168,2827 B2

priate user interface action based can then be initiated. Also 2006 / 0012580 A1 7 /2006 Perski et al.
the context can be used to label metadata . 2006 /0146038 A1 7 /2006 Park et al.
2006 /0177112 A1 8 /2006 Yang et al.
2006 /0197750 A1 9 /2006 Kerr et al.
19 Claims, 14 Drawing Sheets 2006 /0197753 Al 9 /2006 Hotelling
2006 / 0256008 Al 11/ 2006 Rosenberg
2006 / 0267957 A 11 /2006 Kolmykov -Zotov et al.
2006 /0267958 AL 11/ 2006 Kolmykov - Zotov et al.
2007/ 0002016 A11 / 2007 Cho et al.
(51) Int. Ci. 2007/ 0070051 A1 3 /2007 Westerman et al.
GOOF 3 /0488 ( 2013 . 01) 2007/0075965 Al 4 /2007 Huppi et al.
G06F 3 /0354 (2013 .01) 2007 /0113198 A1 5 /2007 Robertson et al.
2007 /0126732 Al 6 / 2007 Robertson et al.
G06F 37038 ( 2013 . 01) 2007/0152976 A1 7 / 2007 Townsend et al.
G06F 3 / 0482 ( 2013 . 01 ) 2007/0182663 A1 8 /2007 Biech
(52 ) U .S . CI. 2007/0188477 A1 8 / 2007 Rehm
2007 /0198950 A1 8 /2007 Dodge et al.
CPC ......... GO6F 3/0482 ( 2013.01); G06F 3/0487 2007/ 0247441 Al 10 / 2007 Kim et al.
(2013 .01) ; G06F 3/ 04883 (2013 .01); G06F 2007 /0268274 Al 11 /2007 Westerman et al.
2203 /04104 (2013 .01); G06F 2203/04808 2008/ 0002888 A1
2008/ 0012835 Al
1/ 2008 Yuan
1 /2008 Rimon et al .
(2013 .01 ) 2008 /0040692 A1 2 /2008 Sunday et al.
(58) Field of Classification Search 2008 /0046425 A1 2 /2008 Perski
USPC ..................... 178 / 18 .01 – 19 .07 ; 345 /173 – 183 2008 /0055278 A1 3 / 2008 Locker et al.
See application file for complete search history . 2008 /0106520 A1 5 / 2008 Free et al.
2008/0158145 Al 7 / 2008 Westerman
2008/0158168 A1 7 /2008 Westerman et al .
( 56 ) References Cited 2008/0163130 A1 7 /2008 Westerman
2008/0191898 A1 8 /2008 Janik
U .S . PATENT DOCUMENTS 2008 /0259043 AL 10 /2008 Buil et al.
5 ,414 ,227 A 2008/0292195 Al 11/ 2008 Vijayasenan et al .
5 / 1995 Schubert et al. 2009/0066663 A1 3 /2009 Chang et al .
5 , 463, 725 A 10 / 1995 Henckel et al. 2009 /0073144 A1 3 /2009 Chen et al.
5 ,625, 833 A 4 / 1997 Levine et al. 2009 /0083847 Al 3 /2009 Fadell et al.
5 , 778 ,404 A 7 / 1998 Capps et al. 2009/0100384 Al 4 / 2009 Louch
5 , 867, 163 A 2 / 1999 Kurtenbach 2009 /0109182 A1 4 / 2009 Fyke et al .
5 ,914 ,701 A 6 / 1999 Gercheneld et al. 2009/0153525 Al 6 /2009 Chang
5 ,956 , 020 A 9 / 1999 D ' Amico et al. 2009/0160816 A1 6 /2009 Westerman et al.
6 , 307 ,548 B1 10 /2001 Flinchem et al. 2009 /0167702 A1 7 /2009 Numi
6 ,788, 292 B1 9 / 2004 Nako et al. 2009/0178007 A1 7 / 2009 Matas et al.
6 , 906 ,703 B2 6 /2005 Vablais et al. 2009/0209285 Al 8 / 2009 McMahan
7 ,231,609 B2 6 / 2007 Baudisch 2009 /0228842 AL 9 / 2009 Westerman et al.
7 ,289, 102 B2 10 /2007 Hinckley et al. 2009 /0259969 A1 10 /2009 Pallakoff
7 , 362 ,221 B2 4 / 2008 Katz 2009/ 0262074 A1 10 / 2009 Nasiri et al .
7 , 400 ,316 B2 7 /2008 Appleyard et al. 2009/0265671 A1 10 /2009 Sachs et al.
7 , 499 ,024 B2 3 / 2009 Johnston , Jr. et al. 2009 /0267896 Al 10 /2009 Hiramatsu
7 ,532 , 196 B2 5 / 2009 Hinckley 2010 /0007618 A1 1 / 2010 Park et al.
7 , 567,242 B2 7 / 2009 Perkins et al. 2010 /0020025 A1 1/2010 Lemort et al.
7 ,703 ,047 B2 4 / 2010 Keely, Jr. et al. 2010 /0045705 Al 2 /2010 Vertegaal et al.
7 , 812 , 826 B2 10 /2010 Ording et al. 2010 /0053095 Al 3 / 2010 Wu et al.
7 ,812 ,828 B2 10 / 2010 Westerman et al. 2010 /0053120 A1 3 /2010 Chang et al.
7 , 847 ,789 B2 12 / 2010 Kolmykov -Zotov et al. 2010 / 0079493 A1 4 /2010 Tse et al .
7 ,956 ,847 B2 6 / 2011 Christie 2010 /0083191 A1 4 / 2010 Marshall
7 ,982, 739 B2 7 /2011 Pasula 2010 /0085317 A1 4 /2010 Park
8 , 228 ,292 B1 7 / 2012 Ruiz et al. 2010 /0095234 A1 4 /2010 Lane et al.
8 , 265, 705 B2 9 / 2012 Lee 2010 /0103117 A1 4 /2010 Townsend et al.
8 , 360,669 B2 1/ 2013 Underwood et al. 2010 /0103118 A1 4 / 2010 Townsend et al.
8 ,413,077 B2 4 / 2013 Yamamoto et al. 2010 /0123737 Al 5 / 2010 Williamson et al.
8 ,660 ,978 B2 2 /2014 Hinckley et al. 2010 /0127979 Al 5 /2010 Lee et al.
8 ,982 ,045 B2 3/ 2015 Hinckley et al. 2010 /0139990 A1 6 / 2010 Westerman et al.
8 , 988, 398 B2 3 / 2015 Cao et al. 2010 /0156941 Al 6/2010 Seung
9 ,244 ,545 B2 1/ 2016 Hinckley et al. 2010 /0175018 Al 7 /2010 Petschnigg et al.
2003/0076310 Al 4 / 2003 Kanzaki et al. 2010 /0177121 A1 7 /2010 Homma et al .
2003 /0164821 A1 9 / 2003 Rezania 2010 /0188328 Al 7 /2010 Dodge et al.
2004/ 0012575 Al 1 / 2004 Homer et al. 2010 /0194547 A1 8 / 2010 Terrell et al.
2004 /0047505 Al 3 / 2004 Ghassabian 2010 / 02 14216 A1 8 / 2010 Nasiri et al.
2004/ 0073432 A1 4 /2004 Stone 2010 /0235729 A1 9/ 2010 Kociendaq et al.
2004 /0140962 A1 7 / 2004 Wang et al. 2010 /0281435 A 11/ 2010 Bangalore et al .
2004 /0189594 A1 9 /2004 Sterling 2010 /0295781 AL 11/2010 Alameh et al.
2004 /0203520 Al 10 / 2004 Schirtzinger et al. 2010 /0295799 AL 11/2010 Nicholson et al.
2005 /0024346 A1 2 / 2005 Dupraz et al . 2010 /0298033 Al 11/2010 Lee
2005/ 0052427 Al 3 / 2005 Wu et al. 2010 /0306670 Al 12 /2010 Quinn et al.
2005/ 0079896 A1 4 / 2005 Kokko et al . 2010 /0328227 A1 12 /2010 Matejka et al.
2005/0165839 A1 7 /2005 Madan et al.
2005 /0179648 A1 8 /2005 Barabe et al . 2011/0115741 A1 5 /2011 Lukas et al.
2005/ 0216867 AL 9 / 2005 Marvit et al. 2011/0134026 A1 6 / 2011 Kang et al.
2005/0253817 A1 11/ 2005 Rytivaara et al. 2011 /0163955 A1 7 / 2011 Nasiri et al.
2006 /0026535 Al 2 / 2006 Hotelling et al. 2011 /0167357 A1 7 / 2011 Benjamin et al.
2006 /0109252 A1 5 / 2006 Kolmykov -Zotov et al. 2011/0187651 A1 8 / 2011 Whitlow et al.
2006 / 0136840 A1 6 /2006 Keely, Jr. et al. 2011/0193788 AL 8/2011 King et al.
US 10 ,Page
168,3827 B2

( 56 ) References Cited Goel, et al., “ WalkType: Using Accelerometer Data to Accomodate


Situational Impairments in Mobile Touch Screen Text Entry ” , In
U . S . PATENT DOCUMENTS Proceedings of the SIGCHI Conference on Human Factors in
Computing Systems, May 5 , 2012 , 10 pages.
2011/0197153 AL 8/ 2011 King et al. Grossman , et al., “ Hover Widgets: Using the Tracking State to
2011/0221777 A1 9 / 2011 Ke Extend the Capabilities of Pen -Operated Devices” , In Proceedings
2011/0231796 Al 9 / 2011 Vigil of the SIGCHI Conference on Human Factors in Computing Sys
2011 /0239110 A1 9 /2011 Garrett et al .
2012 /0092268 A1 4 / 2012 Tsai et al. tems, Apr. 22 , 2006 , 10 pages .
2012 / 0092269 Al 4 / 2012 Tsai et al. Hinckley, et al., “ Touch -Sensing Input Devices” , In Proceedings of
2012 /0154293 A1 6 / 2012 Hinckley et al. the SIGCHI Conference on Human Factors in Computing Systems,
2012 /0154294 Al 6 / 2012 Hinckley et al. May 15 , 1999, 8 pages.
2012 /0154295 Al 6 / 2012 Hinckley et al. Kurtenbach , et al., “ Issues in Combining Marking and Direct
2012 /0154296 A1 6 /2012 Hinckley et al. Manipulation Techniques” , In Proceedings of the 4th Annual ACM
2012 /0158629 Al 6 / 2012 Hinckley et al.
2012 /0206330 Al 8/ 2012 Cao et al. Symposium on User Interface Software and Technology , Nov. 11 ,
2012 / 0235904 A1 9 / 2012 Plagemann 1991, 8 pages.
2012 / 0242598 A19 / 2012 Won et al. Lee, et al., “ HandSCAPE : A vectorizing tape measure for on -site
2012 / 0260220 Al 10 / 2012 Griffin measuring applications” , Proceedings of the CHI 2000 Conference
2012 / 0262407 Al 10 / 2012 Hinckley et al. on Human factors in computing systems, CHI 2000 , Apr. 1 -6 , 2000 ,
2012 /0306927 Al 12 / 2012 Lee et al. pp . 137 - 144 , The Hague, The Netherlands .
2012 / 0313865 Al 12 / 2012 Pearce Li, et al., “ Experimental Analysis of Mode Switching Techniques in
2012 / 0327040 A112/ 2012 Simon et al.
2012 /0327042 Al 12 / 2012 Harley et al. Pen -Based User Interfaces” , In Proceedings of the SIGCHI Con
2012 / 0331546 A112/ 2012 Falkenburg et al. ference on Human Factors in Computing Systems, Apr. 2 , 2005 , 10
2013 /0016055 A11 / 2013 Chuang pages .
2013 /0106725 Al 5 /2013 Bakken et al. Li, et al., “ The 1Line Keyboard : A QWERTY Layout in a Single
2013 /0106740 A1 5 /2013 Yilmaz et al. Line ” , In Proceedings of the 24th Annual ACM Symposium on User
2013/0106777 Al 5 / 2013 Yilmaz et al. Interface Software and Technology, Oct. 16 , 2011 , 10 pages .
2013/0120281 A1 5 /2013 Harris
2013 /0159939 A1 6 / 2013 Krishnamurthi Ramos, et al., “ Pressure Widgets” , In Proceedings of the SIGCHI
2013 /0154952 A1 7 / 2013 Hinckley et al. Conference on Human Factors in Computing Systems, vol. 6 , Issue
2013 /0181902 Al 7 /2013 Hinckley et al. 1 , Apr. 24 , 2004, 8 pages .
2013 /0181948 A1 7 / 2013 Sakai Ramos, et al., “ Tumble ! Splat ! Helping Users Access and Manipu
2013 /0201113 A18 / 2013 Hinckley et al. late Occluded Content in 2D Drawings” , In Proceedings of the
2013 /0257777 Al 10 / 2013 Banks et al. Working Conference on Advanced Visual Interfaces,May 23 , 2006 ,
2013 /0335333 Al 12 /2013 Kukulski et al. 8 pages.
2014 /0073432 A1 3 / 2014 Lu et al.
2014 /0078117 A 3 /2014 Asano Rekimoto, Jun , “ Tilting Operations for Small Screen Interfaces” , In
2014 /0104211 A1 4 / 2014 Harris Proceedings of the 9th Annual ACM Symposium on User Interface
2014 /0108979 A1 4 /2014 Davidson et al. Software and Technology, Nov . 6 , 1996 , 2 pages
2014 /0210797 A1 7 /2014 Kreek et al. Ruiz , et al., “ DoubleFlip : A Motion Gesture Delimiter for Mobile
2014 /0253522 A1 9 / 2014 Cueto Interaction ” , In Proceedings of the SIGCHI Conference on Human
2014 /0267025 AL 9 /2014 Kim Factors in Computing Systems, May 7 , 2011, 4 pages.
Sachs, et al., “ 3 -Draw : A Tool for Designing 3D Shapes” , In Journal
OTHER PUBLICATIONS ofIEEE Computer Graphics and Applications, vol. 11 , Issue 6 , Nov .
1991, 9 pages.
Balakrishnan , et al., Digital tape drawing , Proceedings of the 12th Subramanian , et al., “ Multi-layer interaction for digital tables," In
Annual ACM Symposium on User Interface Software and Technol Proc . of the 19th Annual ACM Symposium on User Interface
ogy, ACM Symposium on User Interface Software and Technology , Software and Technology , Oct. 15 , 2006 , pp . 269 - 272 .
UIST ' 99 , Nov . 7 - 10 , 1999 , pp . 161- 169, Asheville, USA . Tashman , et al., “ LiquidText: A Flexible , Multitouch Environment
to Support Active Reading” , In Proceedings of the SIGCHI Con
Balakrishnan , et al., “ The Rockin 'Mouse : Integral 3D Manipulation ference on Human Factors in Computing Systems, May 7 , 2011 , 10
on a Plane” , In Proceedings of the ACM SIGCHI Conference on pages.
Human Factors in Computing Systems, Mar. 22 , 1997 , 8 pages . Tian, et al., “ The Tilt Cursor: Enhancing Stimulus-Response Com
Brandl, et al., “ Combining and Measuring the Benefits of Bimanual patibility by Providing 3D Orientation Cue of Pen ” , In Proceedings
Pen and Direct- Touch Interaction on Horizontal Interfaces” , In of the SIGCHI Conference on Human Factors in Computing Sys
Proceedings of the Working Conference on Advanced Visual Inter tems, Apr. 28 , 2007 , 4 pages .
faces , May 28 , 2008 , 10 pages. Verplaetse , C ., “ Inertial Proprioceptive Devices: Self-Motion
Buxton , William A . S ., “ A Three -State Model ofGraphical Input” , In Sensing Toys and Tools” , In IBM Systems Journal, vol. 35 , Issue
Proceedings of the IFIP TC13 Third Interational Conference on 3 -4 , Apr. 23 , 2013, 12 pages .
Human - Computer Interaction , Aug. 27 , 1990 , 11 pages . Wilson et al., “ XWand : UI for Intelligent Spaces ” , Proc . of the 2003
Card , S . K ., J. D . Mackinlay , G . G . Robertson , The design space of Conf. on Human Factors in Computing Sys' s, CHI 2003, Apr. 5 - 10 ,
input devices, CHI 1990 , Apr. 1990 , pp . 117 - 124 , Seattle, WA , 2003 , pp . 545 - 552 , Ft. Lauderdale, Florida , USA .
USA . Wu et al., “Gesture Registration , Relaxation , and Reuse for Multi
Chu , et al., “ Detail-preserving paint modeling for 3D brushes” , Point Direct - Touch Surfaces ” , First IEEE Int'l Workshop on Hori
Proc . of the 8th Int 'l Symposium on Non - Photorealistic Animation zontal Interactive Human -Comp. Sys's, Tabletop 2006 , Jan . 2006 ,
and Rendering 2010 , NPAR 2010 , Jun . 7 - 10 , 2010 , pp . 27 - 34 , pp . 185 - 192 , Adelaide , Australia .
Annecy, France . Xin et al., “ Acquiring and Pointing : An Empirical Study of Pen
Fitzmaurice , et al., “ An Exploration into Supporting Artwork Ori tilt-based Interaction ” , Proc . of the Int'l Conf. on Human Factors in
entation in the User Interface ” , Proc . of the CHI ' 99 Conf. on Computing Sys's, CHI 2011, May 7 - 12 , 2011, pp . 849 - 858 , Van
Human Factors in Computing Sys 's : The CHI is the Limit , Pitts couver, BC , Canada .
burgh , CHI 1999 , May 15 - 20 , 1999 , pp . 167-174 . Zeleznik et al., " Hands -on Math : A Page -based Multi-touch and Pen
Fitzmaurice , et al., “ Tracking Menus” , In Proceedings of the 16th Desktop for Technical Work and Problem Solving" , Proc . of the
Annual ACM Symposium on User Interface Software and Technol 23rd Annual ACM Symposium on User Interface Software and
ogy, Nov . 2 , 2003, 10 pages . Tech ., Oct. 3 -6 , 2010 , pp . 17 - 26 , New York , NY, USA .
US 10 ,Page
168,4827 B2

( 56 ) References Cited Oviatt , et al., " Toward a Theory of Organized Multimodal Integra
tion Patterns during Human -Computer Interaction ," retrieved at
OTHER PUBLICATIONS < < http :// acm .org > > , ICMI '03 Proceedings of the 5th International
Conference on Multimodal Interfaces, Nov . 2003, pp . 44 -51.
Joselli et al., “GRMOBILE - A Framework for touch and acceler Partridge, et al., “ Tilt Type: Accelerometer- Supported Text Entry for
ometer gesture recognition for mobile ” , Proceedings of the 2009 Very Small Devices," retrieved at < < http ://acm .org > > , UIST ' 02
VIII Brazilian Symposium on Games and Digital Entertainment, Proceedings of the 15th Annual ACM Symposium on User Interface
Oct. 2009 , pp . 141- 150 . Software and Technology, Oct. 2002, pp . 201- 204 .
Joshi, et al., “ Image Deblurring Using Inertial Measurement Sen “ PenLab : Itronix GoBook Duo - Touch ,” retrieved at < < http ://
sors," retrieved at < < http ://acm .org > > , ACM Transactions on Graph pencomputing .com / frames/itronix _ duotouch .html> > , retrieved on
ics, vol. 29 , No . 4 , Article 30 , Jul. 2010 , 9 pages . Jan . 31 , 2012 , Pen Computing Magazine, 3 pages .
Kendrick, “ ChromeTouch : Free Extension for Touch Tables”, GigaOM , Evernote Corp ., “ Penultimate on the App Store on iTunes ” , https://
May 6 , 2010 , 9 pages . itunes.apple .com /us/app /id354098826 ?mt= 8 , retrieved Jun . 20 , 2014 ,
Kim , et al., “ Hand Grip Pattern Recognition for Mobile User pp. 1 -3 .
Interfaces ” , In Proceedings of the 18th Conference on Innovative Premerlani, et al., “ Direction Cosine Matrix IMU : Theory ” , retrieved
Applications of Artificial Intelligence , vol. 2 , Jul. 16 , 2006 , 6 pages. from gentlenav. googlecode.com / files/DCMDraft2 .pdf, May 2009 ,
Kratz , et al., " Unravelling Seams: Improving Mobile Gesture Rec pp. 1 -30 .
ognition with Visual Feedback Techniques,” retrieved at < < http :// Rahman , et al., “ Tilt Techniques: Investigating the Dexterity of
Wrist- based Input," retrieved at < < http :// acm .org > > , CHI ' 09 Pro
acm .org > > , CHI ' 09 Proceedings of the 27th International Confer ceedings of the 27th international Conference on Human Factors in
ence on Human Factors in Computing Systems, Apr. 2009, pp. Computing Systems, Apr. 2009 , pp . 1943- 1952 .
937 - 940 . Rekimoto , J., Pick -and -drop : A direct manipulation technique for
Kurtenbach , et al., “ The design of a GUIparadigm based on tablets , multiple computer environments, Conference on Human Factors in
two -hands, and transparency ” , Proceedings of the ACM SIGCHI Computing Systems, CHI 2000, Apr. 1 , 2000 , The Hague, The
Conference on Human factors in computing systems, CHI 1997 , Netherlands.
Mar. 1997, pp . 35 -42 . Rofouei, et al., “ Your Phone or Mine ? Fusing Body, Touch and
Lester, et al., “ Are You With Me?” — Using Accelerometers to Device Sensing for Multi-User Device -Display Interaction ” , In
Determine if Two Devices are Carried by the Same Person ” , In Proceedings of the SIGCHI Conference on Human Factors in
Proceedings of Second International Conference on Pervasive Com Computing Systems, May 5 , 2012 , 4 pages .
puting, Apr. 21, 2004 , 18 pages. Roudaut, et al., “ TimeTilt: Using Sensor- Based Gestures to Travel
Liao , et al., “ PACER : Fine- grained Interactive Paper via Camera through Multiple Applications on a Mobile Device” , In Proceedings
touch Hybrid Gestures on a Cell Phone ,” retrieved at < < http :// acm . of the 12th IFIP TC 13 International Conference on Human
org > > , CHI ’ 10 Proceedings of the 28th International Conference on Computer Interaction : Part I, Aug . 24 , 2009 , 5 pages .
Human Factors in Computing Systems, Apr. 2010 , pp . 2441- 2450 . Ruiz , et al., “ User-Defined Motion Gestures for Mobile Interac
Liu , et al., “ FlexAura : A Flexible Near- Surface Range Sensor ” , 25th tion ” , In Proceedings of the SIGCHI Conference on Human Factors
Annual ACM Symposium on User Interface Software and Technol in Computing Systems, May 7 , 2011, 10 pages.
ogy , UIST ' 12 , Oct. 7 - 10 , 2012 , pp . 327- 330 , Cambridge , MA , “ Samsung Exhibit II 4G review : Second time around," retrieved at
USA < < http ://www . gsmarena. com /samsung_ exhibit _ 2 _ 4g- review -685p5.
Luff, et al., “ Mobility in Collaboration ” , Proceedings of the ACM php > > , GSMArena.com , Dec. 1 , 2011, p . 5 of online article , 3
1998 Conference on Computer Supported Cooperative Work , CSCW pages.
1998 , Nov . 14 - 18 , 1998 , pp . 305 - 314 , Seattle , WA, USA . Savov, S ., “ Samsung Galaxy S II shows offmotion -zoom option in
Mahony, et al., “ Nonlinear Complementary Filters on the Special TouchWiz 4 .0 ( video )” , Engadget, Mar. 29 , 2011, 3 pages.
Orthogonal Group ” , IEEE Trans. Automat. Contr., 2008 , pp . 1203 Schmidt, et al., “ Advanced Interaction in Context” , In Proceedings
1218 , vol . 53 , No. 5 . of the 1st International Symposium on Handheld and Ubiquitous
Malacria , et al., “ Clutch - Free Panning and Integrated Pan -Zoom Computing, Sep . 27, 1999 , 13 pages.
Control on Touch -Sensitive Surfaces : The CycloStar Approach ," Schmidt, et al., “ Phone Touch : A Technique for Direct Phone Inter
retrieved at < < http ://www .malacria. fr/data/ doc /pdf/ cyclostar.pdf > > , action on Surfaces ” , In Proceedings of the 23nd Annual ACM
Proceedings of the 28th International Conference on Human Factors Symposium on User Interface Software and Technology , Oct. 3 ,
in Computing Systems, Apr. 2010 , 10 pages . 2010 , 4 pages.
Marquardt, et al., “ Cross-Device Interaction Via Micro -Mobility Schwarz , et al., “ A Framework for Robust and Flexible Handling of
and F - formations” , 25th Annual ACM Symposium on User Inter Inputs with Uncertainty,” retrieved at < < http :// acm .org > > , UI
face Software and Technology, UIST ' 12 , Oct. 7 - 10 , 2012, pp. ST ’ 10 , Proceedings of the 23nd Annual ACM Symposium on User
13 - 22 , Cambridge , MA , USA . Interface Software and Technology, Oct. 2010 , pp . 47 -56 .
Mason , et al., “ Grip Forces When Passing an Object to a Partner” , Schwartz, et al., “ Probabilistic Palm Rejection Using Spatiotemporal
Exp . Brain Res., May 2005 , vol. 163, No. 2, pp . 173- 187 . Touch Features and Iterative Classification ” , CHI Conf. on Human
Matulic , et al., Supporting Active Reading on Pen and Touch Factors in Computing Systems, CHI 2014 , Apr. 26 -May 1 , 2014 , pp .
operated Tabletops, Proc . of the Int'l Working Conf. on Advanced 2009- 2012 , Toronto , ON , Canada .
Visual Interfaces, AVI 2012 , May 22 - 25 , 2012 , pp . 612 -619 , Capri Schwesig, et al., " Gummi: A Bendable Computer,” retrieved at
Island, Naples, Italy. < <http :// acm .org > > , CHI ' 04 , Proceedings of the SIGCHI Confer
Mohamed , et al., “ Disoriented Pen -Gestures for Identifying Users ence on Human Factors in Computing Systems, Apr. 2004, pp .
Around the Tabletop Without Cameras and Motion Sensors” , Pro 263 - 270 .
ceedings of the First IEEE International Workshop on Horizontal Sellen , et al., “ The Prevention of Mode Errors through Sensory
Interactive Human -Computer Systems ( Tabletop ’ 06 ), Jan . 2006 , 8 Feedback ," retrieved at < < http :// acm . org > > , Journal of Human
pages. Computer Interaction , vol. 7 , Issue 2 , Jun . 1992 , pp. 141 - 164 .
Moleskine SpA , “ Moleskine Journal on the App Store on iTunes” , Shanklin , “ [ Video ] New HTC Flyer Hands-on Shows Stylus' “ Palm
https:// itunes .apple.com /us/ app /moleskine -journal/id550926297 , Rejection ' in Action ” , Mar. 4 , 2011, 5 pages.
retrieved Jun . 20 , 2014 , pp . 1 - 3 . Siio et al., "Mobile Interaction Using PaperweightMetaphor” , Proc .
Mulroy, “ N - Trig Pushes Pen and Multitouch Input” , PC World , of the 19th Annual ACM Symposium on User Interface Software
retrieved on Jan . 27 , 2011 at < < http ://www .pcworld .com /article/ and Technology , UIST '06 , Oct. 2006 , pp . 111 - 114 , Montreux ,
196723 /ntrig _ pushes _ pen _ and _ multitouch _ input.html> > , May 19 , Switzerland .
2010 , 3 pages . Song et al., “Grips and Gestures on a Multi- Touch Pen ” , Proc. of the
“ N -act Multi- Touch Gesture Vocabulary Set”, retrieved date, Oct. Int 'l Conf. on Human Factors in Computing Sys ' s, CHI 2011 , May
12 , 2011, 1 page . 7 - 12 , 2011 , pp . 1323 - 1332 , Vancouver, BC , Canada.
US 10 ,Page
168,5827 B2

( 56 ) References Cited Traktovenko , Ilya , U .S . FinalOffice Action , U . S . Appl. No. 13/ 530 ,015 ,
dated Nov. 19 , 2014 , pp . 1- 48 .
OTHER PUBLICATIONS Traktovenko , Ilya , U .S . Office Action , U .S . Appl. No. 13 /530 ,015 ,
dated Apr. 28 , 2015, pp . 1 - 32 .
Song et al., “ WYSIWYF : Exploring and Annotating Volume Data Figueroa -Gibson , Gloryvid , U . S . Office Action , U . S . Appl. No.
with a Tangible Handheld Device ", Proc. of the Int'l Conf. on 12 /970 ,949 , dated Mar. 13 , 2014 , pp . 1 -29 .
Human Factors in Computing Sys 's, CHI 2011, Vancouver, BC , Figueroa -Gibson , Gloryvid , U .S . Final Office Action , U . S . Appl.
Canada , May 7 -12 , 2011. No. 12/ 970 ,949 , dated Nov. 29, 2013 , pp . 1- 24 .
Sun , et al., “ Enhancing Naturalness of Pen -and - Tablet Drawing Figueroa -Gibson , Gloryvid , U . S . Office Action , U . S . Appl. No.
through Context Sensing ” , In Proceedings of the ACM International 12 /970 , 949, dated Jun . 21 , 2013 , pp. 1 - 20 .
Conference on Interactive Tabletops and Surfaces, Nov. 13, 2011 , 4 Figueroa -Gibson , Gloryvid , U .S . Final Office Action , U .S . Appl.
pages . No. 12 /970 ,949 , dated Aug . 15 , 2014 , pp . 1 -21 .
Suzuki, et al., “ Stylus Enhancement to Enrich Interaction with Figueroa -Gibson , Gloryvid , U .S . Office Action , U . S . Appl. No.
Computers” , In Proceedings of the 12th International Conference on 12/ 970 , 949 , dated Jan . 2, 2015, pp. 1 - 24 .
Human - Computer Interaction : Interaction Platforms and Tech Figueroa -Gibson , Gloryvid , U . S . Final Office Action , U .S . Appl.
niques, Jul. 22, 2007, 10 pages .
Taylor, et al., “ Graspables: Grasp - Recognition as a User Interface” , No. 12 /970 , 949, dated Jun . 10 , 2015 , pp . 1- 25 .
In Proceedings of the 27th International Conference on Human Figueroa -Gibson , Gloryvid , U . S . Final Office Action , U .S . Appl.
Factors in Computing Systems, Apr. 4 , 2008, 9 pages . No . 12/ 970 , 943 , dated Nov . 6 , 2013 , pp . 1 - 19 .
Thurott, Paul, “ Windows XP Tablet PC Edition reviewed ” , Paul Figueroa -Gibson , Gloryvid , U . S . Office Action , U . S . Appl. No.
Thurrott 's Supersite for Windows, Jun . 25 , 2002, 7 pages. 12/970 , 943 , dated Jun . 10 , 2013 , pp . 1- 21 .
Tian , et al., “ Tilt Menu : Using the 3D Orientation Information of Figueroa -Gibson , Gloryvid , U .S . Office Action , U . S . Appl. No .
Pen Devices to Extend the Selection Capability of Pen -based User 12 /970 ,943 , dated Mar. 13 , 2014 , pp . 1 -25 .
Interfaces ” , In Proceedings of the SIGCHI Conference on Human Figueroa -Gibson , Gloryvid , U .S . Office Action . U . S . Appl. No.
Factors in Computing Systems, Apr. 5 , 2008 , 10 pages . 12/970 , 943 , dated Sep . 17 , 2014 , pp. 1- 20 .
“ TouchPaint. java ” , The Android Open Source Project, 2007 . Figueroa -Gibson , Gloryvid , U . S . Notice of Allowance, U . S . Appl.
“ Using Windows Flip 3D ” , retrieved at < < http ://windows.microsoft. No. 12 /970, 943, dated Dec . 19 , 2014 , pp . 1- 10 .
com / en -US/windows- vista /Using-Windows- Flip - 3D > > , retrieved on Traktovenko, Ilya , U . S . Notice of Allowance, U . S . Appl. No.
Feb . 9 , 2012 , Microsoft Corporation , Redmond , WA, 1 page. 12 /970 , 945 , dated Oct. 16 , 2013 , pp . 1 -7 .
Vogel, et al., “ Conte: Multimodal Input Inspired by an Artist' s Traktovenko , Ilya , U .S . Notice of Allowance , U .S . Appl. No .
Crayon ” , In Proceedings of the 24th Annual ACM Symposium on 12 /970 ,945 , dated Jul. 10 , 2013, pp . 1 - 13 .
User Interface Software and Technology , Oct. 16 , 2011 , 10 pages. Traktovenko , Ilya , U .S . Office Action , U .S . Appl. No. 12 / 970 , 945 ,
Wagner et al., “ BiTouch and BiPad : Designing Bimanual Interaction dated Apr. 22 , 2013 , pp . 1- 17 .
for Hand -held Tablets” , CHIConf. on Human Factors in Computing Figueroa -Gibson , Gloryvid , U .S . Office Action , U .S . Appl. No.
Systems, CHI ’ 12 , May 5 - 10 , 2012, pp . 2317 -2326 , Austin , TX , 12 /970, 939, dated Dec . 19 , 2013 , pp . 1 - 28 .
USA . Figueroa -Gibson , Gloryvid , U . S . Office Action , U . S . Appl. No.
Walker, Geoff, “ Palm rejection on resistive touchscreens” , Veritas et 12 /970 ,939, dated Jun . 5 , 2013, pp . 1 - 26 .
Visus , Nov . 2005, pp . 31- 33 . Figueroa -Gibson , Gloryvid , U .S . Final Office Action , U .S . Appl.
Wigdor, et al., “ Lucid - Touch : A See -through Mobile Device ," No. 12 / 970 , 939 , dated Aug . 22 , 2013 , pp . 1 - 19 .
Proceedings of the 20th Annual ACM Symposium on User Interface Figueroa -Gibson , Gloryvid , U . S . Final Office Action , U .S . Appl.
Software and Technology, Oct. 2007, pp . 269 -278 . No. 12/970 ,939 , dated May 30 , 2014 , pp . 1 -32.
Wigdor, et al., “ TiltText:Using Tilt for Text Input to Mobile Phones," Figueroa -Gibson , Gloryvid , U . S . Office Action , U .S . Appl. No .
retrieved at < < http :// acm .org > > , UIST ' 03 , Proceedings of the 16th 12 /970 ,939, dated Oct. 2 , 2014, pp . 1 -40 .
Annual ACM Symposium on User Interface Software and Technol Figueroa -Gibson , Gloryvid , U .S . Notice of Allowance, U .S . Appl.
ogy, Nov. 2003, pp. 81-90 . No. 12 / 970 , 939 , dated Dec . 19 , 2014 , pp . 1 - 10 .
“ Williamson , et al., " Shoogle : Excitatory Multimodal Interaction on Zhou , Hong, U . S . Office Action , U .S . Appl. No. 13 /026 ,058 , dated
Mobile Devices,” retrieved at < < http ://acm .org > > , CHI '07 , Pro Aug. 29 , 2013 , pp . 1 - 12 .
ceedings of the SIGCHI Conference on Human factors in Comput Zhou , Hong , U .S . Final Office Action , U . S . Appl. No. 13 /026 , 058 ,
ing Systems, Apr. 2007 , pp. 121- 124 .” dated Feb . 26 , 2014 , pp . 1 - 14 .
Wimmer et al., “ HandSense : Discriminating Different Ways of Zhou , Hong, Notice of Allowance , U . S . Appl. No . 13 /026 ,058 ,
Grasping and Holding a Tangible User Interface” , Proc. of the 3rd dated Jul. 17 , 2014 , pp . 1 -5 .
Int'l Conf. on Tangible and Embedded Interaction , TEI ' 09, Feb . Zhou , Hong , Notice of Allowance , U .S . Appl. No. 13/026 ,058 ,
2009 , pp . 359 -362 . dated Nov. 7, 2014 , pp . 1 -5 .
Xin et al.,“ Natural Use Profiles for the Pen : An Empirical Explo Treitler, Damon , U .S . Final Office Action , U . S . Appl. No. 13/ 327 ,794,
ration ofPressure , Tilt, and Azimuth ” , CHI Conf. on Human Factors dated Dec. 19 , 2013 , pp . 1 - 16 .
in Computing Sys 's, CHI ’ 12 , May 5 - 10 , 2012 , pp . 801 - 804 , Austin , Treitler, Damon , U .S . Office Action , U .S . Appl. No. 13 /327 ,794 ,
TX , USA . dated Aug . 16 , 2013 , pp . 1- 16 .
Yee, “ Two -handed interaction on a tablet display ”, Extended Abstracts Treitler, Damon , U .S . Office Action , U .S . Appl. No. 13 /327 ,794 ,
of the 2004 Conf. on Human Factors in Computing Sys 's, CHI dated Jul. 17 , 2014 , pp . 1 - 13 .
Extended Abstracts 2004 , Apr. 24 - 29 , 2004 , pp . 1493- 1496 , Vienna, Treitler, Damon , U . S . Final Office Action , U . S . Appl.No. 13 /327 ,794 ,
Austria . dated Nov . 20 , 2014 , pp . 1 - 13 .
Yoon et al., “ Text Tearing: Opening White Space for Digital Ink Davis, David D ., U .S . Office Action , U .S . Appl. No. 13 /327 ,794 ,
Annotation ” , The 26th Annual ACM Symposium on User Interface dated Mar. 11, 2016 , pp . 1 - 15 .
Software and Tech ., UIST ’ 13 , St. Andrews, United Kingdom , Oct. Davis, David D ., U . S . Final Office Action , U . S . Appl. No. 13/327,794 ,
8 - 11 , 2013 , pp . 107 - 112 . dated Nov . 8 , 2016 , pp . 1- 16 .
Zhou, Hong, U .S . Office Action, U .S . Appl. No. 13 /903, 944 , dated Geisy, Adam , U . S . Office Action , U . S . Appl. No. 13 / 367,377 , dated
Mar. 27 , 2015 , pp . 1 -23 . Feb . 13 , 2014 , pp . 1 -11 .
Zhou , Hong, U .S . Notice of Allowance, U .S . Appl. No. 13 /903, 944 , Geisy, Adam , U .S . Final Office Action , U .S . Appl. No. 13 / 367, 377 ,
dated Jul. 20 , 2015 , pp . 1- 5 . dated Jul. 1, 2014 , pp . 1- 12 .
Zhou , Hong, U .S . Notice of Allowance , U .S . Appl. No. 13 /903 ,944, Geisy, Adam , Notice of Allowance, U .S . Appl. No . 13 /367,377,
dated Aug. 3 , 2015, pp. 1 -2 . dated Oct. 27 , 2014 , pp . 1 - 10 .
Traktovenko , Ilya , U . S . Office Action , U .S . Appl. No . 13 / 530 ,015 , Pervan , Michael, U .S . Office Action , U . S . Appl. No. 14 /303,203 ,
dated Jul. 18 , 2014 , pp . 1- 26 . dated Mar. 1, 2016 , pp. 1 - 11.
US 10 ,Page
168,6827 B2

( 56 ) References Cited the Twenty -Sixth Annual SIGCHIConference on Human Factors in


Computing Systems, Apr. 2008 , pp . 1779 - 1788 .
OTHER PUBLICATIONS Cheng, et al., iGrasp : Grasp - Based Adaptive Keyboard for Mobile
Devices, 2013 ACM SIGCHI Conf. on Human Factors in Comput
Pervan , Michael, U . S . Final Office Action , U . S. Appl. No . 14 /303,203, ing Systems, CHI Extended Abstracts 2013, Apr. 27 -May 2 , 2013 ,
dated Jul. 26 , 2016 , pp. 1- 7 . pp . 2791 -2792 , Paris , France .
Pervan , Michael, U . S . Office Action , U . S . Appl. No. 14 /303,203 , Cheng, et al., " iRotateGrasp : Automatic Screen Rotation Based on
dated Nov. 2, 2016 , pp . 1- 7 . Grasp of Mobile Devices” , 2013 ACM SIGCHI Conf. on Human
Pervan ,Michael, U . S . Notice of Allowance , U . S . Appl. No . 14 /303, 203 , Factors in Computing Systems, CHI Extended Abstracts 2013 , Apr.
dated May 1 , 2017 , pp . 1-7 . 27 -May 2 , 2013 , pp . 2789 - 2790, Paris , France .
McLoone, Peter D ., U . S . Office Action , U . S . Appl. No . 14 /303,234 , Cho, et al., “ Multi- Context Photo Browsing on Mobile Devices
dated Oct. 15 , 2015 , pp. 1 - 16 . Based on Tilt Dynamics,” retrieved at < < http ://acm . org > > ,MobileHCI
McLoone, Peter D ., U .S . Final Office Action , U .S . Appl. No. '07 Proceedings of the 9th International Conference on Human
14 / 303 , 234 , dated Mar. 28 , 2016 , pp . 1 - 16 . Computer Interaction with Mobile Devices and Services, Sep . 2007,
McLoone, Peter D ., U . S . Office Action , U .S . Appl. No. 14 /303 ,234 , pp. 190 - 197.
dated Jun . 16 , 2016 , pp . 1 - 17 . Li et al., “ Virtual Shelves : Interactions with Orientation - Aware
McLoone, Peter D ., U . S . Office Action , U .S . Appl. No. 14 /303,234 , Devices," retrieved at < < http ://acm .org > > , UIST'09, Oct. 2009, pp.
125 - 128 .
dated Dec. 8 , 2016 , pp. 1 - 16 . Cohen , et al., “ Synergistic Use of Direct Manipulation and Natural
McLoone, Peter D ., U .S . Office Action , U .S . Appl. No. 14 / 303 ,234 , Language ,” retrieved at < < http :// acm .org > > , CHI '89 Proceedings
dated Mar. 21 , 2017, pp . 1 - 13. of the SIGCHI Conference on Human Factors in Computing Sys
Seifert, J., Written opinion of the International Preliminary Exam tems, Apr. 1989 , pp . 227 - 233 .
ining Authority, PCT/US2015 /034613, dated Apr. 20 , 2016 , pp . Dachselt, et al., “ Throw and Tilt Seamless Interaction Across
1 - 10 . Devices Using Mobile Phone Gestures” , Proceedings of the 34th
Ernst, M ., Written opinion of the International Preliminary Exam Graphics Interface Conference, May 2008 , 7 pages.
ining Authority, PCT/US2015 /034612 , dated May 18 , 2016 , pp . 1 -5 . Döring, et al., “ Exploring Gesture - Based Interaction Techniques in
Annett, M ., F. Anderson , W . F. Bischof, A . Gupta , The pen is Multi-Display Environments with Mobile Phones and a Multi
mightier: Understanding stylus behaviour while inking on tablets , Touch Table” , Proceedings of the Workshop on Coupled Display
Graphics Interface 2014 , GI ’ 14 , May 7 - 9 , 2014 , pp . 193 - 200 , Visual Interfaces, May 25 , 2010 , pp . 47-54.
Montreal, QC , Canada . “ DuoSense Pen , Touch & Multi - Touch Digitizer," retrieved at
Ashbrook, et al., “ Magic : A Motion Gesture Design Tool,” retrieved < < http ://www .n - trig .com /Data /Uploads/Misc/DuoSense _ Brochure _
at < < http ://research .nokia.com / files/ 2010 -Ashbrook -CHI10 -MAGIC . FINAL .pdf> > , May 2008, N -trig Ltd ., Kfar Saba, Israel, 4 pages .
pdf > > , Proceedings of the 28th International Conference on Human Edge, et al., “ Bimanual Tangible Interaction with Mobile Phones,"
Factors in Computing Systems, Apr. 2010 , 10 pages . retrieved at < < http ://research .microsoft.com / en -us /people /daedge/
Babyak , Richard , “ Controls & Sensors : Touch Tones” , retrieved at edgeteibimanual2009. pdf > > , Proceedings of the 3rd International
< < http ://www .appliancedesign .com / Articles/Controls _ and _ Displays/ Conference on Tangible and Embedded Interaction , Feb . 2009, pp .
BNP _ GUID _ 9 -5 - 2006 _ A _ 10000000000000129366 > > , Appliance 131- 136 .
Design , Jun . 30 , 2007, 5 pages . Eslambolchilar, et al., “ Tilt- Based Automatic Zooming and Scaling
Bao , et al., “ Effect of Tilt Angle of Tablet on Pen -based Input in Mobile Devices a state -space implementation .” retrieved at
Operation Based on Fitts ' Law ” , Proceedings of the 2010 IEEE < <http ://www .des. gla .ac .uk / ~ rod/publications/ EslMur04 -SDAZ .
International Conference on Information and Automation , Jun . pdf > > , Proceedings of Mobile HC12004 : 6th International Confer
2010 , pp . 99 - 104. ence on Human Computer Interaction with Mobile Devices , Springer,
Bartlett, Joel F ., “ Rock ' n ' Scroll is Here to Stay ,” accessed at Sep . 2004 , 12 pages
< < http ://www .hpl.hp .com /techreports /Compaq -DEC /WRL - 2000 - 3 . Essl, et al., “ Use the Force (or something) Pressure and Pressure
pdf> > , Western Research Laboratory, Palo Alto , California , May Like Input for Mobile Music Performance ," retrieved at < < http ://
2000 , 9 pages . www .deutsche-telekom - laboratories .de/ - rohs /papers/Essl- ForceMusic .
Becchio , et al., “ Grasping Intentions: From Thought Experiments to pdf> > , NIME 2010 Conference on New Interfaces for Musical
Empirical Evidence” , Frontiers in Human Neuroscience ,May 2012 , Expression, Jun. 2010 , 4 pages.
vol. 6 , pp . 1 -6 . Fiftythree Inc., " A Closer Look at Zoom " , May 21, 2013 , pp . 1 - 10 .
Bi, et al., “ An Exploration of Pen Rolling for Pen -Based Interac Fiftythree Inc ., “ Pencil ” , Nov. 19 , 2013 , pp . 1 - 12 .
tion ” , In Proceedings of the 21st Annual ACM Symposium on User Frisch , et al., “ Investigating Multi- Touch and Pen Gestures for
Interface Software and Technology , Oct. 19 , 2008 , 10 pages. Diagram Editing on Interactive Surfaces” , In ACM International
Bjørneseth , et al., “Dynamic Positioning Systems— Usability and Conference on Interactive Tabletops and Surfaces, Nov . 23 , 2009 , 8
Interaction Styles," retrieved at < <http ://www .ceng.metu .edu . pages.
tr/~ tcan se705_ s0809 /Schedule/ assignment3.pdf > > , Proceedings of Goel, et al., “ GripSense: Using Built-In Sensors to Detect Hand
the 5th Nordic Conference on Human - Computer Interaction : Build Posture and Pressure on Commodity Mobile Phones” , In Proceed
ing Bridges, Oct. 2008, 10 pages . ings of the 25th Annual ACM Symposium on User interface
Brandl, et al., " Occlusion -aware menu design for digital tabletops” , Software and Technology, Oct. 7 , 2012 , 10 pages.
Proc . of the 27th Int'l Conf. on Human Factors in Computing Guiard , et al., “ Writing Postures in Left -Handers : Inverters are
Systems, CHI 2009 , Extended Abstracts, Apr. 4 - 9 , 2009 , pp . 3223 Hand- Crossers” , Neuropsychologia, Mar. 1984 , pp. 535 -538 , vol.
3228 , Boston , MA, USA . 22 , No. 4 .
Buxton , William , “ Chunking and Phrasing and the Design of Harrison, et al.," Scratch Input: Creating Large, Inexpensive, Unpowered
Human -Computer Dialogues," retrieved at < < http ://www .billbuxton . and Mobile Finger Input Surfaces,” retrieved at < < http ://acm .org > > ,
com /chunking.pdf > > , Proceedings of the IFIP World Computer UIST '08 Proceedings of the 21st Annual ACM Symposium on User
Congress, Sep . 1986 , 9 pages. interface Software and Technology, Oct. 2008, pp . 205 -208 .
Buxton , William , “ Integrating the Periphery and Context: A New Harrison , et al., " Skinput: Appropriating the Body as an Input
Model of Telematics Proceedings of Graphics Interface” , 1995 , pp . Surface,” retrieved at < <http ://acm .org > > , CHI ’ 10 Proceedings of
239 -246 . the 28th International Conference on Human Factors in Computing
Buxton , William , “ Lexical and Pragmatic Considerations of Input Systems, Apr. 2010 , pp . 453 -462.
Structure," retrieved at < <http ://acm .org > > , ACM SIGGRAPH Com Harrison , et al., “ Squeeze Me, Hold Me, Tilt Me! An Exploration of
puter Graphics, vol. 17 , Issue 1, Jan . 1983 , pp . 31 - 37 . Manipulative User Interfaces” , In Proceedings of the SIGCHI
Chen , et al., “ Navigation Techniques for Dual- Display E - Book Conference on Human Factors in Computing Systems, Apr. 18 ,
Readers,” retrieved at < < http ://acm .org > > , CHI ' 08 Proceeding of 1998 , 8 pages.
US 10 ,Page
168,7827 B2

( 56 ) References Cited Hinckley , et al., “ Pen + Touch = New Tools” , In Proceedings of the
23nd Annual ACM Symposium on User Interface Software and
OTHER PUBLICATIONS Technology, Oct. 3 , 2010 , 10 pages .
Hinckley , et al., “ Sensing Techniques for Mobile Interaction ” , In
Hasan , et al., “ A -Coord Input: Coordinating Auxiliary Input Streams Proceedings of the 13th Annual ACM Symposium on User Interface
for Augmenting Contextual Pen -Based Interactions” , In Proceed Software and Technology, Nov. 5 , 2000, 10 pages.
ings of the SIGCHI Conference on Human Factors in Computing Hinckley , et al., “ Sensor Synaesthesia : Touch in Motion , and
Systems, May 5 , 2012 , 10 pages. Motion in Touch ” , In Proceedings of the SIGCHI Conference on
Hassan , et al., “ Chucking: A One-Handed Document Sharing Tech Human Factors in Computing Systems, May 7 , 2011, 10 pages.
nique ,” T. Gross et al. (Eds.): Interact 2009, Part II, LNCS 5727 , Hinckley, Ken , “ Synchronous Gestures for Multiple Persons and
Aug . 2009, pp . 264 - 278 . Computers” , In Proceedings of the 16th Annual ACM Symposium
Herot, et al., “ One -Point Touch Input of Vector Information from on User Interface Software and Technology , Nov . 2 , 2003 , 10 pages .
Computer Displays," retrieved at < < http :// acm . org > > , SIGGRA Holmquist, et al., “ Smart- Its Friends: A Technique for Users to
PH '78 Proceedings of the 5th Annual Conference on Computer Easily Establish Connectionsbetween Smart Artefacts” , In Proceed
Graphics and Interactive Techniques, 12( 3 ), Aug . 1978 , pp . 210 ings of the 3rd International Conference on Ubiquitous Computing,
216 . Sep . 30 , 2001, 6 pages.
Hinckley , et al., “ Codex : A dual screen tablet computer" , Proc . of Hudson , et al., “ Whack Gestures : Inexact and Inattentive Interaction
the 27th Int 'l Conf. on Human Factors in Computing Sys ' s, CHI with Mobile Devices” , In Proceedings of the Fourth International
2009 , Apr. 4 - 9 , 2009, pp . 1933 -1942, Boston , MA, USA . Conference on Tangible, Embedded , and Embodied Interaction , Jan .
Hinckley , et al., “ Design and Analysis of Delimiters for Selection 25 , 2010 , 4 pages.
Action Pen Gesture Phrases in Scriboli,” retrieved at < < http ://acm . Ion , F ., Getting started with Andriod 's use profiles, Mar. 18 , 2014 ,
org > > , CHI ’ 05 Proceedings of the SIGCHIConference on Human IDG Consumer, pp . 1 -7 .
Factors in Computing Systems, Apr. 2005 , pp . 451-460 . Iwasaki, et al., " Expressive Typing : A New Way to Sense Typing
Hinckley, et al., “ Direct Display Interaction via Simultaneous Pen Pressure and Its Applications,” retrieved at < < http ://acm .org > > ,
+ Multi -touch Input” , In Society for Information Display ( SID ) CHI '09 Proceedings of the 27th International Conference Extended
Symposium Digest of Technical Papers, May , 2010 , 4 pages . Abstracts on Human Factors in Computing Systems, Apr. 2009 , pp .
Hinckley , et al., “ Foreground and Background Interaction with 4369 - 4374 .
Sensor-Enhanced Mobile Devices,” retrieved at < < http ://research . Izadi, et al., “ C - Slate : A Multi - Touch and Object Recognition
microsoft .com / en -us/um /people/kenh /papers/tochisensing.pdf> > , ACM System for Remote Collaboration using Horizonta Surfaces” , Sec
Transactions on Computer-Human Interaction , vol. 12 , No . 1 , Mar. ond Annual IEEE International Workshop on Horizontal Interactive
2005, 22 pages . Human -Computer System , Oct. 2007 , pp . 3 - 10 .
Hinckley , et al., “ Manual Deskterity : An Exploration of Simulta “ International Preliminary Report on Patentability Issued in PCT
neous Pen + Touch Direct Input,” retrieved at < < http :// acm .org > > , Application No. PCT/US2015/034612 ” , dated Aug. 29 , 2016 , 8
CHI EA ' 10 Proceedings of the 28th of the International Confer pages.
ence , Extended Abstracts on Human Factors in Computing Systems, “ Supplemental Notice of Allowance Issued in U .S . Appl. No.
Apr. 2010 , pp . 2793 -2802. 14 / 303 , 203” , dated Jun . 12 , 2017 , 2 pages .
Hinckley, Paper: Motion and Context Sensing Techniques for Pen “ International Search Report & Written Opinion Issued in PCT
Computing , http://kenhinckley.wordpress. com /, Jul. 31, 2013, pp . Application No. PCT/US2015 /034612” , dated Sep . 4 , 2015 , 12
1 -3. pages.
U.S.Putene smo se sve us acasa
U . S . Patent Jan . 1, 2019 Sheet 1 of 14 US 10 ,168,827 B2

Core Pen Grips and Poses 100


Ready to Act 108 Half Supination 110 Full Supination 112

102GWritipnsg Les

A LA
pomy

eller
Moto

104TuckGrips th tett
wareligt
Home

106
Grips
Palm sor

Monaste
WON
S

FIG . 1
A Bin a we was
U . S . Patent Jan. 1 , 2019 Sheet 2 of 14 US 10,168,827 B2

:
Single Finger Extension Grips for
Touchscreen Manipulation 200

Index Middle Ring 206 Thumb

Tuck202 ?9?

?y?
"

,
??

/
1
A3
%
45

Palm204
/

"
?
,
?????

FIG.2
FIG . 2
U . S . Patent Jan . 1, 2019 Sheet 3 of 14 US 10 ,168,827 B2

Pinch Index + Middle : 4 or 5 Fingers Thumb on Tablet

44
jFTER "

.
1993
AWAS

Tuck *
*
.
.

hat
BUWA
Palm
kwa

FIG . 3
Other Pen Grips 400
Finger Lift External Precision 402 Passing Grip Thumb Slide 404

PSPPY e

ing

FIG . 4
U . S . Patent Jan . 1, 2019 Sheet 4 of 14 US 10 ,168,827 B2

Sensor Module 508


Sensor Monitor readings of one or more
500 4 Pen | sensors coupled to sensor pen

502
Communications Communication Module 510
Link
Send sensor readings to
computing device
504 506

Touch -Sensitive
Computing Device

Sensor Pen InputModule 512 Computing Device Touch Input Module 514
Receive input from one or more sensors Receive input from one or more touch sensitive
of sensor pen . surfaces of computing device

Grip Database Grip and Touch Determination Module 516


518 Determines the grip/touch patterns on the touch sensitive computing
Known Grips device and the touch patterns on the sensor pen . Correlates these
with other sensor input if desired .

Context Determination Module 520 Meta


Correlates the grip patterns on the sensor pen with the grip and touch patterns on displays or other Data :
touch sensitive surfaces of computing device, and possibly other sensor data , to determine context Labeler
of use and user intent. Determines the preferred hand /non -preferred hand of the user. 524

Command Initiation Module 522


Generates appropriate commands based \ the determined context and user intent.

??????
Palm : Advanced
Rejection :: :: Drafting
526 528
Tools
Magnifier ; Pan / -- - Thumb

530
Tool
532
- -
| Loupe Zoom Contact Menu Tools
Rejection
544
544

FIG . 5
- - - - Pen ; Canvas Drafting ! User
thumni

534
536
"
Tools Tools , Defined or
538 540
542
Other
U . S . Patent Jan . 1, 2019 Sheet 5 of 14 US 10 ,168 ,827 B2

- 600
START

RECEIVE A COMBINATION OF CONCURRENT SENSOR INPUTS ,602


BASED ON GRIP PATTERNS ON A TOUCH -SENSITIVE PEN AND
TOUCH PATTERNS ON A TOUCH -SENSITIVE COMPUTING
DEVICE.

USE THE SENSOR INPUTS TO DETERMINE HOW THE


TOUCH - SENSITIVE PEN IS BEING GRIPPED CONCURRENT
WITH HOW THE TOUCH -SENSITIVE COMPUTING DEVICE IS 604
BEING TOUCHED (E .G ., TYPE GRIP OR TOUCH , PREFERRED
OR NON -PREFERRED HAND , ORIENTATION OF THE DEVICE ).

INITIATE A USER INTERFACE ACTION BASED ON A


COMBINATION OF THE GRIP PATTERN ON THE TOUCH 606
SENSITIVE PEN AND THE TOUCH PATTERN ON THE TOUCH
SENSITIVE COMPUTING DEVICE .

( END

FIG . 6
U . S . Patent Jan . 1, 2019 Sheet 6 of 14 US 10 ,168 ,827 B2

START 700

CORRELATE THE SIGNALS OF CONTACTS ON TWO OR MORE


TOUCH -SENSITIVE DEVICES BY A SINGLE USER . 702

DETERMINE THE CONTEXT OF THE CONTACTS ON THE 704


TWO OR MORE TOUCH - SENSITIVE DEVICES BASED ON THE
CORRELATED SIGNALS .

706
USE THE DETERMINED CONTEXT AS METADATA FOR
USE IN AN APPLICATION .

708
USE THE DETERMINED CONTEXT TO INITIATE A
CONTEXT-APPROPRIATE USER INTERFACE ACTION .

END

FIG . 7
U . S . Patent Jan . 1, 2019 Sheet 7 of 14 US 10 ,168,827 B2

START - 800

SENSE WHEN A POINTING DEVICE COMES WITHIN A


PREDETERMINED RANGE OF A TOUCH -SENSITIVE SURFACE 802
OF A COMPUTING DEVICE .

BEGIN CONTROL OF AN INPUT ON A DISPLAY SCREEN


OF THE COMPUTING DEVICE USING MOTIONS OF THE
POINTING DEVICE WHEN THE POINTING DEVICE IS WITHIN
804
THE PREDETERMINED RANGE .
. ........ . . ..

806
CONTINUE CONTROL OF THE INPUT ON THE DISPLAY
OF THE COMPUTING DEVICE USING MOTIONS OF THE
COMPUTING DEVICE WHEN THE POINTING DEVICE IS
OUTSIDE OF THE PREDETERMINED RANGE .

( END )

FIG . 8
U . S . Patent Jan . 1, 2019 Sheet 8 of 14 US 10 ,168,827 B2

START - 900

SENSE THE GRIP OF A PRIMARY USER ON A TOUCH


SENSITIVE COMPUTING DEVICE . 902

CONCURRENTLY SENSE THE GRIP OF A SECONDARY 904


USER ON A TOUCH - SENSITIVE COMPUTING DEVICE .
.. . . . . . . .

CORRELATE THE GRIPS OF THE PRIMARY AND 906


SECONDARY USER TO DETERMINE THE CONTEXT OF THE
GRIPS .

USE THE CONTEXT OF THE GRIPS TO INITIATE A


COMMAND OF AN APPLICATION EXECUTING ON THE TOUCH
908
SENSITIVE COMPUTING DEVICE .

END )
FIG . 9
CONCURRENTLY RECEIVE SENSORINPUT
U . S . Patent Jan . 1, 2019 Sheet 10 of 14 US 10 , 168,827 B2

Magnifier/Loupe Tool 1100

2 - F Pinch
Tucked Grip

FIG . 11
Full- canvas Pan /Zoom 1200

w wwwww

1 - Finger
2 Pinch NPH Tucked Grip Palmed Grip No Pen Grip

FIG . 12
U . S . Patent Jan . 1, 2019 Sheet 11 of 14 US 10 , 168,827 B2

Drafting tools appears when the pen is ready to write


DRAFTING TOOLS 1300

RiwvN

WWW. WS

2F Pinch NPH Writing Grip

FIG . 13
A tap when pen is tucked brings up the Pen Tools menu
PEN TOOLS 1400
oooon

1 -Finger Finger Tap


Tucked Grip

FIG . 14
U . S . Patent Jan . 1, 2019 Sheet 12 of 14 US 10 , 168,827 B2

Canvas tool 1600 (when pen not ready to write)


__

CANVAS TOOLS
200000

.. . .

1-Finger NPH
NOT (VL )
Writing Grip

FIG . 15
+ 1606

1604

1602

Baton Grip Level

FIG 16
U . S . Patent Jan . 1, 2019 Sheet 13 of 14 US 10, 168 ,827 B2

1704

Level orientation of the


touch -sensitive computing
device

Left handed grip of a first Righthanded grip of a


user 1702 second user 1706

FIG . 17
atent Jan . 1, 2019 Sheet 14 of 14 US 10, 168 ,827 B2

Simplified Computing Device


1825 - 1860

Processing System Sensors 1870


Unit ( s ) Removable
Storage
T 1810
Non -Removable
1820 Storage
m 1880
System Memory GPU (S ) Display Device( s)
Storage Devices
1815 1855
t . . = * . . . . + . . . + . . . + + + + + + + + + + + + + + + + + + + + + + + + + r e

* - - - - - -

1805 1850 1840 1830


Touch -Sensitive Surface (s ) Output Input Comms.
( e .g ., One ormore Displays, Cover, Device( s ) Device(s ) Interface
Case , Frame, Bezel, etc .)

1800

1835 Logic (Memory , 1/ 0 (Wired or


Pen Sensors Processor, etc .) Wireless )
1845 1865 J 1885 )

1875 Power
Simplified Sensor Pen Device L Source

FIG . 18
US 10 , 168 ,827 B2
SENSOR CORRELATION FOR PEN AND Furthermore , some embodiments of the pen and comput
TOUCH -SENSITIVE COMPUTING DEVICE ing device sensor correlation technique can be used to find
INTERACTION meta information to semantically label the context of the
sensed grips or touches . For example , some pen and com
CROSS -REFERENCE TO RELATED 5 puting device sensor correlation technique embodiments
APPLICATIONS correlate the received signals of the contacts by a user on
two or more touch -sensitive devices and determine the
This application is a Continuation application of U .S . context of the contacts based on the correlation of the
patent application Ser. No . 14 /303, 203 , filed on Jun . 12 , signals. The determined context of the contacts is labeled as
2014 by Hinckley, et al., and entitled “ SENSOR CORRE- 10 metadata for use in an application . For example , this context
LATION FOR PEN AND TOUCH - SENSITIVE COMPUT can be which hand the user is holding a device in , how the
ING DEVICE INTERACTION ," and claims priority to U . S . user is holding the device , how many users are sharing a
patent application Ser. No. 14 /303,203 . device , and so forth . The derived metadata can be used to
BACKGROUND label any type of input and can be used for other purposes.
The context metadata also can be used to initiate a context
Many mobile computing devices (e.g ., tablets, phones, appropriate user interface action .
Many , many other capabilities that exploit the natural
etc . ), as well as other devices such as desktop digitizers ,
drafting boards , tabletops , e -readers , electronic whiteboards ways a user or users hold and touch a touch - sensitive pen
and other large displays , use a pen , pointer, or pen type input 20 and /or a touch -sensitive computing device in order to pro
device in combination with a digitizer component of the vide the user with context- specific tools are possible .
computing device for input purposes. Many of these com
puting devices have touch - sensitive screens and interact BRIEF DESCRIPTION OF THE DRAWINGS
with pen and with bare -handed touch or with the two in
combination . 25 The specific features , aspects, and advantages of the
claimed subject matter will become better understood with
SUMMARY regard to the following description , appended claims, and
accompanying drawings where :
This Summary is provided to introduce a selection of FIG . 1 depicts exemplary naturally -occurring core pen
concepts in a simplified form that are further described 30 grips and poses .
below in the Detailed Description . This Summary is not FIG . 2 depicts exemplary naturally - occurring single fin
intended to identify key features or essential features of the ger extension grips for touch screen manipulation .
claimed subjectmatter, nor is it intended to be used as an aid , FIG . 3 depicts exemplary naturally -occurring multiple
in determining the scope of the claimed subject matter. finger extension grips for touch screen manipulation .
In general, embodiments of a pen and computing device 35 FIG . 4 depicts other types of naturally - occurring pen
sensor correlation technique as described herein correlate grips .
sensor signals received from various grips on a touch FIG . 5 provides an exemplary system that illustrates
sensitive pen (e.g., also called a pen , sensor pen or touch - program modules for implementing various implementa
sensitive stylus herein ) and touches to , or grips on , a tions of the pen and computing device sensor correlation
touch -sensitive computing device (for example , a touch - 40 technique, as described herein .
sensitive tablet computing device ) in order to determine the FIG . 6 provides an exemplary flow diagram of using the
context of such grips and touches and to issue context pen and computing device sensor correlation technique to
appropriate commands to the touch -sensitive pen and / or the provide a correlated touch and sensor pen inputmechanism ,
touch -sensitive computing device . as described herein .
In various embodiments of the pen and computing device 45 FIG . 7 provides an exemplary flow diagram of using the
sensor correlation technique a combination of concurrent pen and computing device sensor correlation technique to
sensor inputs received from both a touch - sensitive pen and providemetadata based on correlated signals received due to
a touch - sensitive computing device are correlated . How the contacts on two or more touch - sensitive devices .
touch - sensitive pen is gripped and the how the touch - FIG . 8 provides an exemplary flow diagram of using a
sensitive computing device is touched or gripped are used to 50 pointing device to continue control of an input on a display
determine the context of their use and the user ' s intent. A screen of a touch -sensitive computing device .
user interface action based on these grip and touch inputs FIG . 9 provides an exemplary flow diagram of passing a
can then be initiated . touch -sensitive computing device from a primary user to a
In some embodiments of the pen and computing device secondary user.
sensor correlation technique , pen and computing device 55 FIG . 10 provides an exemplary flow diagram of passing
motions can also be correlated with the concurrent touch and both a touch - sensitive computing device and a touch - sensi
grip sensor inputs to determine whether either device is held tive pen from a primary user to a secondary user.
in the user' s preferred hand or the user ' s non -preferred hand , FIG . 11 provides an exemplary illustration of using the
to determine the context inferred, and to initiate context- pen and computing device sensor correlation technique to
appropriate commands and capabilities. For example , the 60 provide a magnifier/ loupe tool input mechanism based on
determined context can be used to initiate commands to the pen being held in a user ' s preferred hand in a tucked grip ,
suppress accidental content rotation on a display, to pan or as described herein .
zoom content displayed on a display, to display a menu of FIG . 12 provides an exemplary illustration of using the
pen specific tools, to display a menu of canvas tools or pen and computing device sensor correlation technique to
drafting tools , or to display a magnifier/ loupe tool that 65 provide a full-canvas pan /zoom input mechanism based on
magnifies content on the display of the computing device, the actions of a user 's non -preferred hand, as described
among others . herein .
US 10 , 168 ,827 B2
FIG . 13 provides an exemplary illustration of using the grip sensing, and multi-touch inputs when they are distrib
pen and computing device sensor correlation technique to u ted across separate pen and touch - sensitive computing
provide a drafting tool input mechanism based on the touch (e .g., tablet) devices.
and grip patterns of both the user 's preferred hand and the 1. 2 Grips and Sensing for Tablets
user ' s non - preferred hand , as described herein . 5 Lightweight computing devices such as tablets afford
FIG . 14 provides an exemplary illustration of using the many new grips , movements , and sensing techniques .
pen and computing device sensor correlation technique to Implementations of the pen and computing device sensor
provide a pen tool input mechanism based on the touch and correlation technique described herein are the first to imple
grip patterns ofboth the user 's preferred hand and the user 's ment full grip sensing and motion sensing on both tablet
non -preferred hand , as described herein . 10 and pen at the same time for sensing pen touch interac
FIG . 15 provides an exemplary illustration of using the tions. Note that “ grip ” may be recognized by the system as
pen and computing device sensor correlation technique to a holistic combination of a particular hand -contact pattern
provide a canvas tool input mechanism based on the touch that takes into account the 3D orientation or movement of
and grip patterns of both the user ' s preferred hand and the the implement or device as well; that is , no clear line can be
user 's non -preferred hand, as described herein . 15 drawn between touch -sensitive grip -sensing and inertial
FIG . 16 provides an exemplary illustration of two users motion -sensing, per se , since all these degrees of freedom
passing a touch -sensitive pen between them and initiating may be employed by a recognition procedure to classify the
context- appropriate capabilities based on the grip patterns of currently observed “ grip ” as accurately as possible . Thus ,
both users on the touch - sensitive pen and the orientation of whenever the term “ grip ” is used the possible combination
the pen , as described herein . 20 of touch with motion or orientation degrees -of- freedom is
FIG . 17 provides an exemplary illustration of two users implied .
passing or sharing a touch - sensitive computing device 1.3 Palm Detection and Unintentional Touch Handling
between them based on the grip patterns of the two users on Palm contact can cause significant false -activation prob
the touch -sensitive computing device and the orientation of lems during pen + touch interaction . For example , somenote
the touch - sensitive computing device , as described herein . 25 taking applications include palm -blocking but appear to rely
FIG . 18 is a general system diagram depicting a simplified on application -specific assumptions about how and where
general -purpose computing device having simplified com the user will write . Some palm -rejection techniques require
puting and I/O capabilities, in combination with a touch the user to bring the pen tip on or near the screen before
sensitive pen having various sensors , power and communi- setting the palm down , which requires users to modify their
cations capabilities, for use in implementing various 30 natural movements. Implementations of the pen and com
implementations of the pen and computing device sensor puting device sensor correlation technique use sensors to
correlation technique, as described herein . detect when a touch screen contact is associated with the
hand holding the pen .
DETAILED DESCRIPTION 1 .4 Sensor-Augmented and Multi -DOF Pen Input
35 Auxiliary tilt , roll , and other pen degrees -of- freedom can
In the following description of the implementations of the be combined to call up menus or trigger mode switches
claimed subject matter, reference is made to the accompa - without necessarily disrupting natural use . Implementations
nying drawings , which form a part hereof, and in which is of the pen and computing device sensor correlation tech
shown by way of illustration specific implementations in nique implement capabilities where the user can extend one
which the claimed subject mattermay be practiced . It should 40 or more fingers while tucking the pen . The pen and com
be understood that other implementations may be utilized puting device sensor correlation technique implementations
and structural changes may bemade without departing from can sense these contacts as distinct contexts with separate
the scope of the presently claimed subject matter. functions, even if the user holds the pen well away from the
1.0 Introduction screen .
The following paragraphs provide an introduction to 45 Pens can be augmented with motion , grip , and near
mobile sensing, sensor-augmented pens, grip sensing, and surface range sensing. One type of pen uses grip sensing to
pen + touch input on touch -sensitive computing devices . detect a tripod writing grip , or to invoke different types of
1. 1 Mobile Sensing on Handheld Computing Devices brushes. Other systemsuse an integrated IMU on the pen as
Tilt, pressure , and proximity sensing on mobile devices a feature to assist grip recognition and sense the orientation
enables contextual adaptations such as detecting handed - 50 of associated computing device / tablet ( e . g . for horizontal vs .
ness, portrait/ landscape detection , or walking versus station - drafting table use ) to help provide appropriate sketching
ary usage . Grip sensing allows a mobile device to detect how aids . The pen and computing device sensor correlation
the user holds it , or to use grasp to automatically engage technique implementations described herein go beyond
functions such as placing a call, taking a picture, or watching these efforts by exploring sensed pen grips and motion in
a video . Implementations of the pen and computing device 55 combination with pen + touch gestures , and also by extending
sensor correlation technique described herein adopt the grip sensing to the tablet itself .
perspective of sensing natural user behavior, and applying it 2.0 Natural Pen and Tablet User Behaviors
to single or multiple touch -sensitive pen and touch -sensitive Implementations of the pen and computing device sensor
computing device ( e.g ., tablet) interactions . correlation technique described herein use natural pen and
Multi-touch input and inertial sensors ( Inertial Measure - 60 touch - sensitive computing device (e . g ., tablet ) user behav
mentUnits ( IMU ' s) with 3 - axis gyroscopes, accelerometers , iors to determine the context associated with these behaviors
and magnetometers ) afford new possibilities for mobile in order to provide users with context- appropriate tools . As
devices to discern user intent based on grasp and motion such , some common grips that arise during digital pen - and
dynamics . Furthermore , other sensors may track the position tablet tasks , and particularly touch screen interactions articu
of these mobile devices . Implementations of the pen and 65 lated while the pen is in hand are useful to review and are
computing device sensor correlation technique illustrate enumerated below and shown in FIGS. 1, 2 , 3 and 4 . A wide
new techniques that leverage these types ofmotion sensing, variety of behaviors ( listed as B1-B11 below ) have been
US 10 , 168,827 B2
observed and were used in designing various implementa - incidental contact with the screen . Another form of touch
tions of the pen and computing device sensor correlation screen avoidance is perching the thumb along the outside
technique. The following paragraphs focus on behaviors of rim of touch - sensitive computing device ( e .g ., the tablet
right -handers ; left-handers are known to exhibit a variety of bezel), rather than letting it stray too close to the touch
additional grips and accommodations. It should be noted 5 screen when picking up the touch -sensitive computing
that the behaviors discussed below and shown in FIGS . 1 , 2 , device . These unnatural and potentially fatiguing accommo
3 and 4 are only exemplary in nature and other behaviors are dations reflect a system 's inability to distinguish the context
entirely possible . of intentional versus unintentional touch .
2 .1 Behavior B1 . Stowing the Pen while Using Touch . 2 .9 Behavior B9. Finger Lift for Activating Pen Controls.
The tendency of users to stow the pen when performing 10 It was observed that users only activate a pen barrel button
touch gestures on a touch screen of a touch -sensitive com - from the Writing grip (FIG . 1, 102 , 108 , 110 ), and then only
puting device such as the tablet is obvious . Users typically with the index finger. Users hold the pen still when tapping
only put the pen down when they anticipate they will not the button . The thumb is also potentially available for
need it again for a prolonged time, or if they encounter a task controls from the Palm - like Thumb Slide grip ( FIG . 4 , 404 ).
that they feel is too difficult or awkward to perform with 15 2 . 10 Behavior B10 . External Precision Grip .
pen -in - hand , such as typing a lot of text using the on - screen Users employ an External Precision grip ( FIG . 4 , 402 ),
keyboard . with the pen held toward the fingertips and perpendicular to
2. 2 Behavior B2. Tuck Vs. Palm for Stowing the Pen . the writing surface, for precise pointing at a small target .
There are two distinct grips that users employ to stow the This provides the possibility to provide contextual enhance
pen . These are a Tuck grip (pen laced between fingers ) 20 ments , such as automatically zooming the region of the
shown in FIG . 1 104 and a Palm grip (with fingers wrapped tablet screen under the pen tip , when this grip is detected .
lightly around the pen barrel) shown in FIG . 1 106 . Users 2 . 11 Behavior B11 . Passing Grip .
stow the pen during pauses or to afford touch interactions . Passing prehension is observed when participants pass the
2 .3 Behavior B3. Preferred Pen Stowing Grip Depends on pen and touch - sensitive computing device (e . g ., tablet ) to
Task Context. 25 another person . Users tend to hold the device securely , in
For users that employ both Tuck grips 104 and Palm grips more of a power grip , and extend it from their body while
106 , a Tuck grip affords quick , transient touch interactions, keeping it level, so that their intent is clear and so that the
while a Palm grip is primarily used if the user anticipates a other person can grab it from the far side .
longer sequence of touch interactions . Other users only use Having described these natural behaviors, the following
Tuck grips 104 to stow the pen . 30 sections describe how the recognition of all of these grips ,
2 .4 Behavior B4 Grip Vs. Pose . touches and motions are used to leverage these behaviors in
For each grip — that is , each way of holding the pen — a order to provide context- appropriate tools for carrying out a
range of poses where the pen orientation is changed occur, user 's intended actions.
often by wrist supination (i.e . turning the palm upward ). 3.0 Introduction to the Pen and Computing Device Sensor
Human grasping motions with a pen therefore encompass 35 Correlation Technique :
the pattern of hand contact on the barrel, as well as the 3D The pen and computing device sensor correlation tech
orientation of the pen . As shown in FIG . 1 , full palmar n ique implementations described herein contribute cross
supination 112 is observed for the Tuck grip 104 and Palm device synchronous gestures and cross -channel inputs for a
grip 106 , but only half-supination 110 for the Writing grip . touch - sensitive computing device /pen (e .g ., tablet- stylus )
2 . 5 Behavior B5 . Extension Grips for Touch . 40 distributed sensing system to sense the naturally occurring
As shown in FIGS. 2 and 3 , many Extension Grips exist user behaviors and unique contexts that arise for pen + touch
where users extend one or more fingers while holding the interaction . Note , however, that while much of the discus
pen to make contact with a touch screen . These were sion here focuses on pen / tablet interactions, other pen - like
classified broadly as single - finger extension grips (FIG . 2 , mechanical intermediaries or small wearable devices can
200 ) vs .multiple - finger extension grips (FIG . 3 , 300 ), which 45 enable context - sensing while interacting with tablets using
users can articulate from either the Tuck or the Palm grip . variations of the pen and computing device sensor correla
(Note that, while not illustrated , three -finger extension grips tion technique . A small motion - sensing ring worn on the
are also possible from some grips ). index finger, for example , could sense when the user taps the
2 .6 Behavior B6 . Variation in Pen Grips. screen with that finger versus, another digit . Watches , sen
Users exhibit many variations in a tripod grip for writing 50 sors worn on the fingertip or fingernail, bracelets, arm
which leads to variations in users' resulting Tuck , Palm , and bands, bandages or wraps , elbow pads , braces , wrist -bands ,
Extension grips. For example, one user's style of tucking led gloves augmented with sensors , subcutaneous implants, or
her to favor her ring finger for single-touch gestures ( see even e -textile shirt sleeves with embedded sensors, represent
Tuck -Ring Finger Extension Grip (FIG . 2 , 206 )) . other similar examples that would enable and suggest related
2 . 7 Behavior B7 . Consistency in Grips . 55 techniques to individuals skilled in the art. Likewise , other
Each user tends to consistently apply the same pen grips manual tools such as a ruler, compass, scalpel, tweezer,
in the same situations. Users also tend to maintain whatever stamp, magnifying glass , lens, keypad , calculator, french
grip requires the least effort, until a perceived barrier in the curve , shape template , paint -brush , or airbrush could serve
interaction (such as fatigue or inefficiency ) gives them an as pen - like implements, whether held in the preferred or
incentive to shift grips . Users switch grips on a mobile 60 non -preferred hand , that also enable related techniques.
computing device ( e . g ., tablet) more often when sitting than Implementations of the pen and computing device sensor
standing, perhaps because there are few effective ways to correlation technique described herein employ grip and
hold or re - grip such a device while standing. touch sensing to afford new techniques that leverage how
2 .8 Behavior B8. Touch Screen Avoidance Behaviors. users naturally manipulate these devices. Implementations
Users often adopt pen grips and hand postures, such as 65 of the pen and computing device sensor correlation tech
floating the palm above a touch screen while writing, or n ique can detect whether the user holds the pen in a writing
splaying out their fingers in a crab -like posture, to avoid grip or palmed and /or tucked between his fingers in a stowed
US 10 , 168 ,827 B2
mode . Furthermore , pen and computing device sensor cor- sion . Further, the sensor pens or touch - sensitive pens
relation technique implementations can distinguish bare described herein can be adapted to incorporate a power
handed inputs , such as drag and pinch gestures produced by supply and various combinations of sensors . For example ,
a user 's non -preferred hand , from touch gestures produced there are various possibilities of incorporating power into
by the hand holding the touch - sensitive pen which neces- 5 the pen , such as by inductive coupling, a super capacitor
sarily imparts a detectable motion signal to the pen . Imple - incorporated into the pen that recharges quickly when the
mentations of the pen and computing device sensor corre - pen comes in range or is docked to or placed on /near a
lation technique can sense which hand grips the touch - computing device , a battery incorporated in the pen , obtain
sensitive computing device ( e. g., tablet), and determine the ing power via pen tether, or acquiring parasitic power via
screen ' s relative orientation to the pen and use the screen ' s 10 motions of the pen . The power supply may feature automatic
orientation and touch patterns to prevent accidental screen low -powermodes when the pen is not moving or not being
content rotation . By selectively combining sensor signals held . The sensors may inform this decision as well. Various
from the touch - sensitive pen and the touch - sensitive com - combinations of sensors can include , but are not limited to ,
puting device and using them to complement one another, inertial sensors , accelerometers , pressure sensors, grip sen
implementations of the pen and computing device sensor 15 sors , near - field communication sensors , RFID tags and / or
correlation technique can tailor user interaction with them to sensors , temperature sensors , microphones , magnetometers ,
the context of use , such as, for example, by ignoring capacitive sensors , gyroscopes , sensors that can track the
unintentional touch inputs while writing , or supporting position of a device , finger print sensors, galvanic skin
contextually - appropriate tools such as a magnifier for response sensors , etc ., in combination with various wireless
detailed stroke work that appears when the user pinches with 20 communications capabilities for interfacing with various
the touch -sensitive pen tucked between his fingers . computing devices. Note that any or all of these sensors may
Implementations of the pen and computing device sensor be multi -axis or multi -position sensors ( e . g ., 3 -axis accel
correlation technique , as described herein , use a touch erometers , gyroscopes , and magnetometers). In addition , in
sensitive pen enhanced with a power supply (e. g ., battery ) various implementations , the touch -sensitive pens described
and multiple sensors ( e . g ., a sensor pen ) to enable a variety 25 herein have been further adapted to incorporate memory
of input techniques and commands based on the correlated and / or computing capabilities that allow them to act in
grip patterns of a user holding the touch - sensitive pen and combination or cooperation with other computing devices ,
touch contacts and grips on the touch - sensitive computing other touch -sensitive pens, or even as a standalone comput
device ( e . g ., a touch -sensitive tablet computer ) and associ- ing device .
ated motions and orientations of these devices. For example, 30 Implementations of the pen and computing device sensor
pressure sensors can be used to detect the user ' s grip patterns correlation technique are adaptable for use with any touch
on the sensor pen and touch and grip patterns on the sensitive computing device having one or more touch
touch - sensitive computing device . Implementations of the sensitive surfaces or regions ( e . g ., touch screen , touch
pen and computing device sensor correlation technique sensitive bezel or case, sensors for detection of hover-type
correlate sensor pen grips and touch - sensitive computing 35 inputs, optical touch sensors, etc .). Note that touch -sensitive
device touches and grips to determine the intentions of the computing devices include both single - and multi-touch
user and the context in which the user wishes to use the devices. Examples of touch -sensitive computing devices can
touch - sensitive pen or the touch -sensitive computing device . include , but are not limited to , touch - sensitive display
This is based on naturally occurring user behaviors such as, devices connected to a computing device, touch -sensitive
for example , whether a user is gripping either device with 40 phone devices , touch -sensitive media players , touch - sensi
their preferred hand or their non -preferred hand. The deter - tive e -readers , notebooks , netbooks, booklets (dual- screen ),
mined user intentions and context of use are then used to tablet type computers , or any other device having one or
generate context- appropriate commands and capabilities for more touch - sensitive surfaces or input modalities. The
the touch -sensitive computing device and /or the touch touch - sensitive region of such computing devices need not
sensitive pen . 45 be associated with a display, and the location or type of
The term pressure as described herein , as relating to contact- sensitive region ( e. g . front of a device on the display ,
pressure sensors and the like, may refer to various sensor versus back of device without any associated display ) may
types and configurations. For example , in various cases and be considered as an input parameter for initiating one or
implementations , pressure may refer to pen tip pressure more motion gestures (i. e ., user interface actions corre
exerted on a display . In general, pen tip pressure is typically 50 sponding to the motion gesture ).
sensed by some type of pressure transducer inside the pen , The term “ touch ” as used throughout this document will
but it is also possible to have the pen tip pressure sensing generally refer to physical user contact (e. g ., finger, palm ,
done by the display/digitizer itself in some devices . In hand , etc .) on touch sensitive displays or other touch sen
addition , the term pressure or pressure sensing or the like sitive surfaces of a computing device using capacitive
may also refer to a separate channel of sensing the grip 55 sensors or the like. However, some touch technologies
pressure of the hand (or fingers ) contacting an exterior incorporate some degree of non -contact sensing, such as the
casing or surface of the touch -sensitive pen or touch - use of highly sensitive self-capacitance detectors to detect
sensitive computing device . Various sensing modalities the geometry of the fingers , pen , and hand near the display
employed by the pen and computing device sensor correla as well as the pen -tip hover sensing. Arrays of IR sensor
tion technique may employ both types of pressure sensing 60 emitter pairs or sensor -in -pixel display elements can also be
(i.e ., pen tip pressure and grip pressure ) for initiating various deployed on pens, tablets , and keyboards for this purpose .
capabilities and commands . Hence touch and grip may incorporate such non - contact
Various devices used to enable some of the many imple- signals for a holistic or unified notion of “ grip ” detection as
mentations of the pen and computing device sensor corre - well .
lation technique described herein include pens, pointers, pen 65 In addition , pen and computing device sensor correlation
type input devices , etc ., that are often referred to herein as technique implementations can use a variety of techniques
a sensor pen or touch - sensitive pen for purposes of discus- for differentiating between Valid and invalid touches
US 10 , 168 ,827 B2
10
received by one or more touch -sensitive surfaces of the tive sensors , gyroscopes , IR or capacitive proximity sensors ,
touch - sensitive computing device . Examples of valid finger print sensors galvanic skin response sensors , etc .) and
touches and contacts include user finger touches ( including provides that sensor input to a grip and touch determination
gesture type touches ), pen or pen touches or inputs , hover module 516 . Similarly a computing device touch input
type inputs , or any combination thereof. With respect to 5 module 514 receives input from one or more sensors of the
invalid or unintended touches , pen and computing device touch -sensitive computing device 504 ( e . g ., inertial, accel
sensor correlation technique implementations disable or e rometers , pressure , touch , grip , near - field communication ,
ignore one or more regions or sub - regions of touch - sensitive RFID , temperature , microphones, magnetometers , capaci
input surfaces that are expected to receive unintentional tive sensors, gyroscopes , IR or capacitive proximity sensors ,
contacts , or intentional contacts not intended as inputs , for 10 finger print sensors, galvanic skin response sensors , etc .) and
device or application control purposes . Examples of contacts provides that sensor input to a grip and touch determination
that may not be intended as inputs include, but are not module 516 .
limited to , a user ' s palm resting on a touch screen while the The grip and touch determination module 516 determines
user writes on that screen with a pen or holding the com - the grip of a user on the sensor pen 502 based on the contact
puting device by gripping a touch sensitive bezel, etc . 15 of a user ' s hand on touch - sensitive surfaces of the sensor pen
The pen and computing device sensor correlation tech - ( and /or the orientation of the pen — yaw , pitch roll or some
nique implementations provide a number of advantages subset of that — and /or other information from the sensors).
relating to pen -based user interaction with a touch - sensitive For example , the sensor signals from the user ' s grip on the
pen and touch -sensitive computing devices , including, but pen can be compared to a database 518 of grip patterns in
not limited to : 20 order to determine the grip pattern or patterns of a user
Novel solutions that sense grip and motion to capture the gripping the pen . In one implementation a trained classifier
full context of pen and touch - sensitive computing is used to classify the sensor signals into grip patterns on the
device (e . g . tablet ) use . sensor pen 502 based on grip training data . Note that this
Using sensors to mitigate unintentional touch ( from the grip training data may be ( 1) for a large sample of many
palm , or from the thumb when picking up the device ), 25 users ; ( 2 ) adapted or trained based only on inputs from the
but also to promote intentional touch by a non - pre - specific user; and ( 3 ) someweighted combination of the two .
ferred hand , or via extension grips to interleave pen and Also , separate databases based on salient dimensions of the
touch inputs. input (e .g . left -handed vs . right-handed user, size or type of
Novel contextually - appropriate tools that combine grip , the device being used , current usage posture of the device ,
motion , and touch screen contact, including , for 30 whether on a desk , held -in -hand , resting on the lap , and so
example , distinct tools for bare -handed input, pinch forth )may trigger the use of separate databases optimized in
input while tucking the pen , and drafting tools that the whole or in part to each use - case . Similarly , the grip and
user can summon with the non -preferred hand when the touch determination module 516 determines the touch of the
pen is poised for writing. user on a display of the touch - sensitive computing device
3 . 0 Exemplary System : 35 504 based on the signals of contact of the user' s fingers or
The pen and computing device sensor correlation tech - hand on a display of the touch -sensitive computing device
nique implementations operate, in part, by correlating sensor (and /or the orientation of the device and /or other informa
inputs from a touch -sensitive pen and a touch -sensitive tion from the sensors ). Additionally , the grip and touch
computing device to trigger various actions and capabilities determination module 516 can determine if the user is
with respect to either the touch - sensitive pen or the touch - 40 gripping touch -sensitive surfaces of the case of the touch
sensitive computing device or both . sensitive computing device . For example , the sensor signals
FIG . 5 provides a diagram of an exemplary system 500 from the user 's grip on the case of the touch -sensitive
that illustrates program modules for implementing various computing device can be compared to a database 518 of grip
implementations of the pen and computing device sensor patterns in order to determine the grip pattern or patterns of
correlation technique . More specifically, FIG . 5 shows a 45 the user gripping the device . In one implementation , one or
touch - sensitive pen or sensor pen 502 in communication more touch - sensitive sensors report an image of what parts
with touch -sensitive computing device 504 via communica - of the case are being touched . Various image processing
tions link 506 . As discussed in further detail herein , the techniques can be used to interpret the image and deduce the
sensor pen 502 can include a variety of sensors . A sensor grip . In one implementation, a (multi- touch , capacitive ) grip
module 508 in the sensor pen 502 monitors readings of one 50 pattern is sensed on the case of the device ( e .g ., the case
or more of those sensors, and provides them to a commu- incorporates a matrix of capacitive sensors ) and motion
nications module 510 to be sent to the touch -sensitive signals and orientation of the touch - sensitive computing
computing device 504 (or possibly another computing device (and/or pen ) are also fed into this determination . In
device (not shown) that performs computations and provides some implementations, if the touch screen has non -contact
inputs to the touch -sensitive computing device 504 ). It 55 proximity sensing capability , then sensing proximity at the
should be noted that in another implementation some ( or all) screen edges of the device can serve as a good proxy for grip
computation may be done directly on the pen before sending sensing on the case . In one implementation a trained clas
(i. e ., grip recognition with machine learning ), and some data sifier is used to classify the sensor signals into grip patterns
might not always be sent to the touch -sensitive computing on the case of the touch - sensitive computing device based
device (i. e ., if the outcome is used local to the pen ). In some 60 on grip training data .
instances , the touch -sensitive computing device may send The grip and touch determination module 516 correlates
information to the pen instead of, or on top of, the pen the sensor inputs from the sensors of the sensor pen 502 and
sending data to the touch -sensitive computing device . the touch -sensitive computing device 504 to associate how
A sensor pen inputmodule 512 receives input from one or the user is gripping the pen with how the user is interacting
more sensors of sensor pen 502 (e . g ., inertial, accelerom - 65 with the screen or the case of the touch - sensitive computing
eters , pressure , touch , grip , near -field communication , device . This correlated data can be used to determine the
RFID , temperature, microphones, magnetometers, capaci user 's preferred hand/non - preferred hand of a user touching
US 10 , 168 ,827 B2
11 12
the sensor pen and/or the touch sensitive computing device . any particular motion gesturesmay consider these different
In one implementation , the preferred hand is distinguished ranges of the sensor pen , in combination with any other
from the non -preferred hand due to the pen motion . A bump correlated inputs, touches , and/ or motions of the computing
in the pen motion is measured when a part of the preferred device .
hand comes in contact with the touch screen of the com - 5 In some implementations the raw sensor readings can be
puting device . After the contact, it can also be continuously reported or transmitted from the sensor pen 502 to the
confirmed that the pen motion correlates to the touch screen computing device 504 for evaluation and characterization by
motion in order to confirm that the hand holding the pen is the computing device . For example , raw sensor data from
being held in the user 's preferred hand . In some implemen - inertial sensors within the sensor pen 502 can be reported by
tations it is not necessary for the pen to register a bump when 10 the sensor pen to the touch - sensitive computing device 504 ,
it touches the screen ( e .g ., if the touch to the screen is very with the computing device then determining pen orientation
subtle or soft) in order to determine the user ' s preferred hand as a function of the data from the inertial sensors . Alter
or non -preferred hand as long as correlating motions are nately , in various implementations, the sensor pen 502 uses
observed . onboard computational capability to evaluate the input from
The determined grip and touch patterns, as well as other 15 various sensors . For example , sensor data derived from
correlated sensor inputs , can also be input into a context inertial sensors within the sensor pen 502 can be processed
determination module 520 . The context determination mod - by a computational component of the sensor pen to deter
ule 520 determines the user 's intent and the context of the mine pen orientation , with the orientation of tilt then being
actions that the user is intending from the correlated grip reported by the sensor pen to the computing device 504 .
patterns , touch patterns and other sensor data . Context 20 Any desired combination of reporting of raw sensor data
examples include, but are not limited to , how many users are and reporting of processed sensor data to the computing
interacting with a pen or touch -sensitive computing device, device by the sensor pen 502 can be performed depending
how many devices are being interacted with , whether a user upon the computational capabilities of the sensor pen . How
is holding the sensor pen or the computing device in the ever , for purposes of explanation , the discussion herein
user ' s preferred vs . non -preferred hand , individual or rela - 25 generally refers to reporting of sensor data to the touch
tive motions of the pen or the computing device, how the sensitive computing device 504 by the sensor pen 502 for
touch - sensitive pen or the touch - sensitive computing device further processing by the computing device to determine
is being gripped or touched , application status, pen orien various commands, motion gestures or other input scenarios.
tation , touch -sensitive computing device orientation , rela In some implementations the user 's touch on the touch
tive orientation of the sensor pen to the computing device, 30 sensitive screen of the computing device is correlated to a
trajectories and /or accelerations of the touch -sensitive pen bump to the gyro or accelerometer in the pen in order to
and the touch - sensitive computing device , identity of the determine what the user is doing . In fact , many implemen
user , and so forth . This context data can be sent to a metadata tations of the technique use correlations between touch ,
labeling module 524 which can be used to semantically label bump onto a pen and pen grip to determine what the user' s
this data . 35 intentions are . A few examples of various motion gestures
The touch / grip patterns and other associated sensor read and capabilities enabled by the pen and computing device
ings and any associated context data are input into the sensor correlation technique are briefly introduced below .
command initiation module 522 . The command initiation For example , one such input technique , referred to as a
module 522 evaluates the available inputs to trigger one or “Magnifier” or “ Loupe tool” (FIG . 5 , 530 ), uses a sensor
more commands or capabilities on the touch - sensitive com - 40 input from the user ' s grip on the touch - sensitive pen 502 to
puting device and /or the sensor pen in order to assist in discern that the pen is held in a tucked position in the user 's
carrying out the user 's intent in a more efficient and user hand . Concurrently , the user 's touch on the touch -sensitive
friendly manner by triggering or activating one or more screen of the computing device is registered to that of the
commands or motion gestures on the touch -sensitive pen user making a two - finger touch gesture such as a pinching
502 and /or the touch sensitive computing device 504 . 45 motion relative to the screen . These two sensed inputs are
Examples of variousmotion gestures triggered or commands correlated so that the pinching motion with the hand holding
(526 , 528 , 530 , 532 , 534 , 536 , 538 , and 540 ) which will be the pen brings up the Magnifier /Loupe tool 530 . The ratio
discussed in detail below are activated by the command nale is that since the user demonstrates the intent to use the
initiation module 522 . Implementations of the pen and pen again soon by the fact the pen is stowed only tempo
computing device sensor correlation technique allow user - 50 rarily and still at the ready, touch interactions with the
defined motion gestures 542 to be defined via a user inter - pen - holding -hand should emphasize capabilities in support
face that allows the user to define one or more capabilities of the pen . The Magnifier/Loupe tool 530 is advantageous in
or motion gestures using sensor pen grips in combination that it supports and emphasizes a quick zooming that affects
with touch inputs and /or grips on the touch - sensitive com - a local area of the content displayed on the screen only,
puting device 504 . Many of the motion gestures or capa - 55 which is especially well suited to detail work with the pen .
bilities 526 through 540 are described in further detail later In some implementations the user' s touch on the touch
in this document, with examples of many of these motion sensitive screen of the computing device is correlated to a
gestures and capabilities being illustrated in FIG . 11 through bump to the gyro or accelerometer in the pen in order to
FIG . 17 . determine the user 's intent.
With respect to hover range , in various implementations, 60 A related gesture referred to as a “ full -canvas pan /zoom "
the pen and computing device sensor correlation technique gesture ( FIG . 5 , 532), in one implementation uses a sensed
considers distance of the sensor pen 502 above the digitizer user 's two - finger touch gesture such as a pinching motion on
of the touch - sensitive computing device 504 . While a variety the display of the touch - sensitive computing device 504
of ranges can be considered , in various tested implementa - (e . g ., tablet ) with the user 's non - preferred hand to trigger a
tions , three range categories were considered , including: 65 standard full canvas zoom of content displayed on the
physical contact, within hover range of the digitizer, or display of the computing device . This functionality is pro
beyond range of the digitizer. The activation mechanism for vided when the sensor pen 502 is not at the ready ( e. g ., when
US 10 , 168 ,827 B2
13 14
the sensor pen is held in a Tucked grip or a Palmed grip in Some implementations of the pen and computing device
the user ' s preferred hand , or the user is not holding the sensor correlation technique use capacitive grip sensing on
sensor pen at all ). The non -preferred hand gesture , then , can the back and sides of the case of the touch - sensitive com
be recognized as such due to the lack of corresponding puting device 504 to detect a number of additional contacts.
motion in the pen . Some implementations of the pen and computing device
Another gesture , referred to herein as a “ pen tools” sensor correlation technique can be used in a multiple
gesture (FIG . 5 , 536 ), uses sensors of the sensor pen 502 and user/multiple device mode . For example , in some imple
the touch -sensitive computing device 504 to detect that the mentations a grip of a primary user on a touch - sensitive
user is holding the sensor pen in a tucked position ( e . g ., computing device 504 and a grip of a secondary user on the
Tucked grip in the user ' s preferred hand and to detect a 10 touch - sensitive computing device are sensed and correlated .
concurrent contact such as a tap from a finger of the user 's The grips can be sensed , for example , by touch -sensitive
preferred hand on the touch -sensitive display of the com - surfaces on the touch - sensitive computing device or the
puting device 504 . The correlation of these two actions grips can be sensed by determining each user' s galvanic skin
brings up a menu of pen - specific tools . In some implemen - response and the differences in the galvanic skin response
tations a different palette of pen tool options can be dis - 15 can be used to tell one user from the other. Other sensor data
played when the pen is held in a fully palmed position in the can also be used , such as , for example , accelerometer data ,
user' s preferred hand. position data , trajectory data and so forth . The grips of the
A similar gesture , referred to herein as a “ canvas tools ” primary and secondary users are evaluated to initiate a
gesture ( FIG . 5 , 538 ) uses sensors of the pen and on the command in an application executing on the touch -sensitive
touch - sensitive computing device to detect when a user is 20 computing device 504 . The correlated grips and the orien
holding the touch - sensitive pen 502 in a non -writing posi- tation of the touch - sensitive computing device 504 can be
tion in the user ' s preferred hand and to detect a contact such evaluated to determine that the grip of the secondary user
as a finger tap with the user 's non -preferred hand on the represents a handoff of the computing device to the second
touch screen of the computing device 504 . These correlated ary user from the primary user. In this case one or more
concurrent actions cause a menu of canvas tools to be 25 capabilities of the touch - sensitive computing device may be
displayed on the display of the computing device . For restricted following the handoff. Alternately , the grip of the
example , this menu of tools could include undo/redo, cut secondary user can be determined to be concurrent with the
copy -paste , new page, search and similar commands. grip of the primary user. In this case , a sharing mode can be
Another gesture , referred to herein as a " drafting tools ” entered on the computing device . For example , the second
gesture (FIG . 5 , 540 ) uses sensors of the pen 502 and on the 30 ary user may only be allowed to view and markup only
touch - sensitive computing device 504 to detect that the content that is currently displayed on a display of the
touch - sensitive pen is held in a writing grip in the user 's computing device. There are many instances where the user
preferred hand and to detect a contact such as a one finger picks up and holds a pen with both hands , making the pen
touch to the touch -sensitive display of the computing device unavailable .
504 with the user ' s non -preferred hand. The one finger tap 35 Implementations of the pen and computing device sensor
with the user 's non -preferred bare hand brings up a set of correlation technique use grip to sense with which hand the
drafting tools when the pen is ready to write . These special user is holding the touch - sensitive computing device 504 .
tools support the use of the pen . Such tools might be , for Implementations of the pen and computing device sensor
example , a compass to draw arcs or an airbrush to color correlation technique then use this to summon a “ Thumb
content on the screen . 40 Menu ” (FIG . 5 , 534 ) at the appropriate side of the touch
Additionally , an “ advanced drafting tools ” gesture ( FIG . sensitive computing device 504 , which advantageously
5 , 528 ) uses sensors on the pen 502 and sensors on the allows the user to activate various buttons and menus
touch - sensitive computing device 504 to detect that the directly with the thumb. If the user grasps the touch
sensor pen is held in a writing grip in the user ' s preferred sensitive computing device with a second hand, implemen
hand and to detect a two - finger touch gesture such as a 45 tations of the pen and computing device sensor correlation
pinching motion at the touch -sensitive display of the com - technique leave the Thumb Menu visible at the side where
puting device with the user ' s non - preferred hand . The pinch - it first appeared.
ingmotion with the user ' s non - preferred hand brings up a set Similar to the touch - sensitive computing device , in some
of advanced drafting tools when the pen is ready to write . implementations of the pen and computing device sensor
These special tools further support use of the pen that 50 correlation technique a grip of a primary user and the grip of
benefits from a second touch . For example , these advanced a secondary user on a touch -sensitive pen 502 can be sensed .
drafting tools can include a ruler or alignment edge, a French As discussed previously , other data such as acceleration ,
curve or a function to pull a new sheet of paper (with position and trajectory data , for example , may also be
two- finger position , rotation , and/or scaling possibly, considered . The grips of the primary and secondary users
depending on the particular advanced drafting tool 55 can then be correlated to initiate a command in an applica
employed ). tion executing on the touch -sensitive pen 502 . For example,
Other examples of correlated sensor pen motions relative the grip of the secondary user can be determined to represent
to the computing device include using pen sensors ( e . g ., a handoff of the pen 502 to the secondary user from the
accelerometers , pressure sensors , inertial sensors , grip sen - primary user . In this case data can be transferred from the
sors , etc .) to determine when the sensor pen is picked up or 60 primary user to the secondary user via the handoff of the pen
put down by the user. By considering the current sensor grip 502 . The secondary user can then download the transferred
pattern (i. e ., tucked in the user' s preferred hand, ready to data to a computing device for example a different com
write in the user' s preferred hand , put down ) correlated with puting device that the stylus was originally used with .
the grip patterns of the user on the computing device ( e . g ., Alternately , one or more capabilities of the touch - sensitive
not held by the user, touching the display with the user's 65 pen 502 can be restricted following the handoff .
non - preferred hand or non - preferred hand, etc .), appropriate Besides grip patterns on the sensor pen 502 and on the
commands can be initiated . computing device 504 , some implementations consider
US 10 , 168 ,827 B2
15 16
motion of the computing device and the sensor pen . For of the touch - sensitive pen , and it is assumed that the pen is
example , implementations use themotions of the sensor pen held in the preferred hand ). A context-appropriate user
502 and the computing device along with the grip patterns interface action is initiated based on a combination of the
to determine if the pen is held in the user 's preferred hand grip pattern on the touch -sensitive pen and the touch pattern
( or non -preferred hand). The user's preferred hand and 5 on the touch -sensitive computing device, as shown in block
non -preferred hand can be determined from the correlated 606 . For example, the determination of how the pen device
grip patterns and associated information . For example ,when is being touched by the user ' s preferred hand or non
a motion signal representing a pen bump occurs at the same preferred hand can be used to infer the context of use and /or
time as a new contact on the touch -sensitive computing user intent in order to initiate a context-appropriate com
device the grips/ touches on both the sensor pen 502 and the 10 mand or capability such as those described with respect to
touch - sensitive computing device 504 are correlated and FIG . 5 .
specific motion gestures or commands are initiated based on FIG . 7 depicts another exemplary implementation 700 for
these recognized grip patterns. practicing the pen and computing device sensor correlation
Further, in various implementations, the pen and comput
ing device sensor correlation technique also advantageously 15 technique in order to find meta information . As shown in
rejects or ignores unwanted or unintended touches by a user. block 702 , the signals of contacts on two or more touch
A palm rejection module (FIG . 5 , 526 ) can be used for this sensitive devices by a single user are correlated . For
purpose. In particular, the palm rejection module 526 evalu example , one of the two or more touch -sensitive computing
ates any touch to determine whether that touch was intended devices can be a sensor pen and one could be a tablet
or it was made by a palm inadvertently resting on the 20 computing device . The context of the contacts on the two or
touch - sensitive computing device , and then either accepts more touch -sensitive devices is determined based on the
that touch as input for further processing, or rejects that correlation of the signals, as shown in block 704 . The
touch . In addition , in various implementations, the palm determined context of the contact is labeled as metadata for
rejection module disables or ignores (i.e ., " rejects ” ) user use in an application , as shown in block 706 . For example
palm touches on or near particular regions of any touch - 25 the context meta data could include, but is not limited to ,
sensitive surfaces, depending upon the context of that touch . how many users are interacting with a pen or touch - sensitive
Note that " rejected ” touchesmay still be handled by the pen
and computing device sensor correlation technique as an
computing device, how many devices are being interacted
input to know where the palm is planted , but flagged such with , whether a user is holding the sensor pen or the
that unintentional button presses or gestures will not be 30 computing
relative
device in the user 's preferred hand , individual or
motions of the pen or the computing device ,how the
triggered in the operating system or applications by acci sensor pen or the touch -sensitive computing device is being
dent . In some implementations the pen and computing
device sensor correlation technique is able to track the palm gripped or touched , pen orientation , touch -sensitive com
( for example using an assigned contact identifier ) when the puting device orientation , relative orientation of the sensor
contact is moving ( as long as the contact is touching ) 35 pen to the computing device, and so forth . If finger print
Furthermore, in some implementations if new contacts are sensors are available, fingerprints can be used to determine
detected within a given radius of the palm contact, they
they will
will which user is holding a device , which finger(s ) the user is
touching the device with , if the user is holding the device
also be labeled as palm contacts (e. g., knuckles ) and
ignored . Similar to the palm rejection module 526 , a thumb with his preferred hand and in which grip , among other
contact rejection module 544 can also be enabled . 40 things. The finger print sensor could also be used to recog
4 . 0 Exemplary Processes nize the user by his or her finger print to establish a preferred
An exemplary system for practicing implementations of mode for the user, for example . The metadata can be further
the pen and computing device sensor correlation technique used to initiate a context- appropriate user interface action , as
having been described , the following section discusses shown in block 708 , or it can be used for some other
exemplary processes for practicing various implementations 45 purpose . It should be noted that a similar process could be
of the pen and computing device sensor correlation tech - advantageously used to find and label metadata for more
nique . than one user .
FIG . 6 depicts an exemplary process 600 for practicing FIG . 8 depicts yet another exemplary implementation 800
the one implementation of the technique . As shown in block for practicing the pen and computing device sensor corre
602 , a combination of concurrent sensor inputs from a 50 lation technique . As shown in block 802 when a pointing
touch - sensitive pen device and a touch - sensitive computing device comes within a predetermined range of a touch
device are received . These concurrent sensor inputs include sensitive surface of a computing device the presence of the
one or more sensor inputs based on grip patterns on the computing device is sensed . For example, the pointing
touch - sensitive pen and one or more sensor inputs based on device could be a touch - sensitive pen . Control of an input on
touch patterns on the touch - sensitive computing device . In 55 a display screen of the computing device is started using
some cases there may be no touch pattern on the pen or the motions of the pointing device when the pointing device is
touch - sensitive computing device if the user is not touching within the predetermined range , as shown in block 804 .
or gripping one of these two items. The sensor inputs are Control of the input on the display screen is continued using
used to determine how the touch -sensitive pen is being motions of the pointing device when the pointing device is
gripped concurrent with how the touch - sensitive computing 60 outside of the predetermined range , as shown in block 806 .
device is being touched , as shown in block 604 . For It should be noted that hover sensing is innate to electro
example , the grip patterns and touch patterns are evaluated magnetic digitizers , which sense the x ,y location of the
to determine if the user is using the user ' s preferred hand or pointer tip ( e . g ., pen tip when the pointer ( e . g ., pen ) is close
the user 's non -preferred hand (i. e ., in some implementations enough to the screen . The pointer ( e. g ., pen ) will keep
the dominant/ preferred hand is detected based on correlation 65 reporting its 3D orientation (and motion signals ) via a radio
of the touchdown of the hand on the touch -sensitive com - link . This can also be used to infer some relative motion , or
puting device with a bump in the accelerometer or the gyro to continue cursor control. Some implementations of the pen
US 10 , 168, 827 B2
17
and computing device sensor correlation technique can also This is advantageous in that it allows a user to transfer data
use accelerometer or other sensor data to track the position or objects from one device to one or many other devices .
of the computing device . 5 .0 Details and Exemplary Implementations :
FIG . 9 depicts another exemplary process 900 for prac - An introduction to the pen and computing device sensor
ticing the pen and computing device sensor correlation 5 correlation technique implementations, as well as an exem
technique. A grip of a primary user on a touch -sensitive plary system and exemplary processes for practicing the
computing device is sensed using sensor data ( for example , technique having been provided , the following paragraphs
provide details of various exemplary implementations of the
sensor data shows that the grip is with one hand, the device pen and computing device sensor correlation technique .
is flat and acceleration of the device is toward a secondary
user who is most likely on the opposite side of the primary 10 aAlthough many of the details below make reference to using
user), as shown in block 902 . A grip of a secondary user on understood that thisin iscombination
tablet computer with a pen , it should be
just one exemplary device used to
the touch - sensitive computing device is also sensed using simplify the description . The exemplary implementations
sensor data ( for example , sensor data shows a grip of the described below could be used with any touch - sensitive
secondary user on the opposite side of the device ice as
as the 1515 computing device (e .g., phone , desktop digitizer , phablet,
primary user ), as shown in block 904. The grips of the e -reader, electronic whiteboard , vehicle touch - display and
primary and secondary users are correlated to determine the so forthi.
context of the grips ( for example whether they are sharing By evaluating correspondences between touch - screen
the device or they are passing it the other user) as shown in input ( or touch inputs on other surfaces of the computing
block 906 , and to initiate a command in an application 20 device ) and touch -sensitive pen grips , the pen and comput
executing on the touch -sensitive computing device , as ing device sensor correlation technique infers additional
shown in block 908 . As discussed previously the context of information about how the user is touching the screen or
the grips could indicate that the touch -sensitive computing other touch -sensitive surface. By correlating signals from
device is being passed from the primary user to the second touch -sensitive pen grips with signals from bare -handed
ary user. Alternately , the context of the grips could indicate 25 touch inputs on the computing device as well as motions and
that the primary user is sharing the touch -sensitive comput- orientation of the pen and computing device and other
ing device with the secondary user. Depending on the sensor data , the pen and computing device sensor correlation
context different commands and capabilities willbe enabled technique enables a variety of context- appropriate tools and
on the touch -sensitive computing device . For example , a motion gestures to aid a user in completing intended tasks
user ' s intention to share their device could be confirmed , for 30 which can be inferred from the correlated sensor signals .
example , by using a voice command . Note that such correlations may look at correspondences (or
FIG . 10 depicts yet another exemplary process 1000 for non -correspondences, as the case may be) in motion or grip
practicing the pen and computing device sensor correlation sensor data that occurs before , during, and after contact with
technique . In this implementation both the touch - sensitive the touch screen ( for correlations involving the tablet digi
computing device and the touch - sensitive pen are passed 35 tizer ). Information at or close to the time of contact may be
from a primary user to a secondary user. As shown in block used to determine a preliminary assessment of the type of
1002 , sensor inputs representing a grip of a primary user on contact, for example , with the possibility of a more defini
the touch -sensitive computing device and the touch -sensi- tive assessment at a later time as further sensor signals and
tive pen are sensed. At about the same time, the sensor inputs device input events arrive in real-time. In this way , the
representing a grip of the secondary user on the touch - 40 system can provide immediate , or near -immediate , real-time
sensitive computing device and the touch - sensitive pen are response and feedback to the user ' s input events while
also sensed , as shown in block 1004 . The grips of the taking maximum advantage of all the information to deter
primary and secondary users on the pen and the touch - mine as correctly as possible the context of the user ' s
sensitive computing device are correlated to determine the actions.
context of the grips (block 1006 ) and to initiate a command 45 For purposes of explanation, the following discussion will
in an application executing on the touch - sensitive pen or on refer to a sketching or drawing type application in the
the touch - sensitive computing device, as shown in block context of a tablet - type computing device. However, it
1008 . For example , the grips can be determined to represent should be understood that both the sensor pen and the
a handoff of the pen and the computing device to the touch - sensitive computing device ( s ) are fully capable of
secondary user from the primary user. In this case data can 50 interaction and interoperation with any desired application
one or more capabilities of the touch - sensitive pen or the type or operating system type . In other application contexts ,
touch - sensitive computing device can be restricted follow such as , for example , active reading or mathematical sketch
ing the handoff . Likewise , if the sensors and a digitizer on ing, different gestures or mappings can be defined . In fact,
the touch -sensitive computing device indicate that the same as noted above , any desired user-definable gestures and
user is employing the same pen on a different digitizer ( e . g ., 55 concurrent pen -and - touch inputs can be configured for any
on a different computing device ), this can carry state infor - desired action for any desired application , operating system ,
mation ( such as the mode or color/thickness /nib style of the or computing device . Further, it should also be understood
pen ) or files and data (such as the current clipboard contents ) that voice or speech inputs , eye gaze inputs , and user
across to another device . This can be determined by observ - proximity and body posture inputs ( such as provided by a
ing that a user is holding one tablet while writing or touching 60 depth camera ) can be combined with any of the various input
a finger to another, separate tablet, for example . In any of techniques discussed herein above to enable a wide range of
these implementations the determined context can be used to hybrid input techniques .
initiate a command in an application executing on the 5 . 1 Grip + Motion Interaction Techniques
touch - sensitive computing device or on the touch -sensitive Exemplary implementations of the pen and computing
pen . Variations are also possible . For example , a single user 65 device sensor correlation technique were employed in the
may send commands or transfer data to many touch -sensi context of a simple sketching application that supports
tive computing devices using a single touch - sensitive pen . annotation , panning and zooming , and some additional
US 10 , 168, 827 B2
19 20
sketching tools . Some of the capabilities of these implemen - grip , which eliminates many desirable scenarios (e.g . pan
tations are discussed in the paragraphs below . Many other ning and zooming with the nonpreferred hand ), as well as
implementations and combinations of these capabilities are simultaneous pen + touch gestures .
possible . When a user plants his hand on the touch - sensitive screen
5 . 1. 1 Pen Orientation Sensed Relative to a Touch -Sensi- 5 of a computing device ( e . g ., tablet), it simultaneously
tive Computing Device induces a corresponding signal on the pen ' s motion sensors .
In some implementations, inertial sensor fusion allows ( The device ' s motion sensors also pick up some of this
implementations of the pen and computing device sensor motion , but it is damped somewhat by the greater mass of
correlation technique to maintain a common reference frame the device.) Nonetheless, motion signals from the pen and/or
relative to a touch - sensitive computing device such as, for 10 the touch -sensitive computing device (e .g., tablet) may be
example , a tablet. Some implementations of the technique used in combination to help infer the type of contact that
employ a tablet- relative orientation at the grip - recognition
phase , as well as in the interpretation of the pen motion ( such occurs, and may themselves be correlated with one another,
as for the airbrush tool, described in more detail later ). Some such as to sense that the touch - sensitive computing device
implementations of the technique only can sense the orien - 15 ( e . g ., tablet) is being moved while the pen is docked
tation of the pen relative to the touch -sensitive computing (attached ) to it, or not.
device ( e. g ., tablet). In some implementations, inertial sens When the hand holding the pen contacts the touch
ing cannot reliabily determine the (x ,y ) translation or z - al sensitive computing device (e .g ., tablet), the pen ' s motion
titude of the pen without resort to some other absolute exhibits a characteristic hard -contact profile similar to that
external reference . In other implementations, where an 20 seen with bump, whack , and thump gestures in other con
external absolute reference is available , technique imple texts. Implementations of the pen and computing device
mentations may continue to track the absolute (x , y ) and sensor correlation technique look for a peak in the signal
altitude ( z) of the pen tip even when it is beyond the sensing corresponding to the pen 's motion sensors ( for example, by
range of the screen itself, by fusing the inertial sensing using the second order finite difference on the three com
capability with the absolute reference . The absolute refer - 25 bined axes of t an accelerometer or gyro ) that exceeds a
ence may be an extrinsic source (e . g . optical tracking of minimum threshold within a given window , for example a
markings or IR sources on the pen ) or intrinsic sources ( e .g . 10 -sample window . It is known exactly when to look for this
radio signal strength triangulation to determine the approxi- signal because a palm plant produces a bump in the pen ' s
mate distance of the pen radio transceiver from other trans- motion at the same time that the touch screen detects the new
ceivers in the tablet or environment ). In general , wireless 30 contact. Tested implementations of the pen and computing
and radio signal strengths can be used to approximate device sensor correlation technique can identify this peak
position which can be enough (when combined with fine - within as little as 5 ms, or up to a maximum of 56 ms after
grained inertial sensing ) to produce a good estimate of the the touch -down event in the worst case . Although in general
x , y , z position of a device relative to another — this can be the focus here is on palm contact on the screen portion of a
accomplished through the triangulation of multiple sources . 35 device , note that this bump signal can be applied to both
Additionally, in some implementations a depth camera may front and back -of-device hand contacts , i.e. touch signals
be used to track the pen and /or the touch -sensitive comput- produced by the touchscreen , by grip sensors on the casing
ing device . of the tablet , or both . Hence back /side of device contacts can
5 . 1 . 2 Detecting Unintentional Palm Contact differentiate various types of hand and pen contact as well.
Sensing unintentional palm contact on a touch -sensitive 40 Thus implementations of the pen and computing device
surface or screen is a difficult problem because , at the onset sensor correlation technique employ a fairly low threshold
of touch , there is often insufficient information to distinguish for the bump signal, allowing even rather subtle palm
what type of touch is occurring . A palm can be recognized contact to be sensed , while also trivially rejecting other
as a touch with a large contact area, but such contacts motion signals that do not occur coincident to a new touch
typically start small and may take a while to pass a certain 45 screen contact . This detection scheme works well for most
size threshold . Also , some unintentional touches (such as normal pen interactions during writing.
contact produced by the knuckles ) may never turn into a For as long as the detected palm contact persists, imple
“ large ” contact . This strategy therefore necessitates delays mentations of the pen and computing device sensor corre
(introduces lag ) in processing touch events, and stillmay fail lation technique can also flag any new touches as a “ palm ”
to detectmany contacts . 50 if they land within a prescribed radius ( for example , a 300
To increase stability and avoid fatigue , users naturally rest pixel radius ) of the initial contact. In some implementations
their hand on the writing surface , but current tablet users are this may incorporate a model of expected hand contact
forced to adopt touch screen avoidance behaviors. Simply regions, given the user's grip and angle of holding the pen ,
sensing that the user is holding the pen is not sufficient as well as the relative screen orientation , such as the hand
because people stow the pen while using touch and employ 55 occlusion models, among others . One implementation of the
various extension grips to touch the screen . Pen orientation pen and computing device sensor correlation technique
is also insufficient because each grip can be associated with provides feedback of the initial palm detection by playing a
a range of wrist supinations and because users hold the pen brief animation of a “ radar circle ,” centered on the palm
in a variety of ways . down location , that fades as it expands. This advantageously
However, since unintentional touch primarily occurs inci- 60 provides non - distracting feedback that confirms to the user
dent to writing, sensing the Writing grip itself is a powerful that the palm contact was successfully detected . Without this
cue , particularly because the user typically adopts a writing feedback , the user may be left with a nagging uncertainty as
grip prior to resting his hand on the display . Hence , a highly to whether or not their palm has triggered an undesired
conservative palm - rejection procedure can simply reject any action ( such as calling up a menu , or leaving behind an ink
touch that occurs when the pen is held in the Writing grip . 65 trace ) that is currently occluded by the hand. Such feedback
This , however, precludes intentional touches made by the is optional , however , and may be disabled by the user or by
non -preferred hand whenever the pen is held in the writing specific applications .
US 10 , 168 ,827 B2
21
5 . 1.3 Permitting Intentional Touch Magnifier/Loupe Tool to drag it to a new location. A single
Implementations of the pen and computing device sensor finger tap, or pen stroke, anywhere outside of the Magnifier /
correlation technique permit simultaneous intentional Loupe Tool is dismissed , leaving the canvas undisturbed at
touches, even when the palm is resting on the screen . In its original zoom level.
general, in some implementations, any new touch that 5 Note that since some implementations of the pen and
occurs away from the palm — which is outside of a prede - computing device sensor correlation technique employ a
termined radius of a previously detected palm contact and minimum motion threshold to detect the bump signal, if the
which is not accompanied by a bump on the pen ) - repre - user touches their fingers down very lightly the pen may not
sents a true intentional touch . Some implementations of the detect a motion signal sufficient to exceed this threshold .
pen and computing device sensor correlation technique use 10 Nonetheless , these thresholds of the pen and computing
the first two additional touches that are not flagged as a palm device sensor correlation technique are sufficient to detect
contact to support a pinch -to -zoom gesture . Palm contact is the motions produced when users naturally bring their
ignored and does not interfere with pan / zoom , even if the fingers to the screen with the pen is stowed .
palm moves. Other single or multiple finger or whole hand 5 . 1.6 The Drafting Tools
gestures can also be supported . 15 The Drafting Tools (FIG . 13 , 1300 ) capability arose from
However , because some implementations of the pen and the observation that users often maintain the Writing grip
computing device sensor correlation technique still track the between bursts of writing activity . For example, during
palm - rather than outright ‘ rejecting ' it per se this pauses users often rotate the wrist away from the screen , to
approach also can support techniques that use the palm bring the pen into the Writing — Half Supination pose .
location as an input, such as to help to correctly orientmenus 20 Hence , the Writing grip itself represents an interesting
or to anchor objects . context that can be explicitly supported by providing various
5 . 1 . 4 the Magnifier Tool Vs. Full Canvas Zoom drafting tools that take into account that the user is holding
The functionality of the Magnifier/Loupe Tool 1100 the pen in a ready-to -write posture .
shown in FIG . 11 employs a focus -plus-context magnifica - In some implementations of the pen and computing
tion technique (known as the “ loupe ” ) which is especially 25 device sensor correlation technique , the user calls up the
well suited to sketching tasks where the user wants to make Drafting Tools menu explicitly , by a single contact such as
a few detailed strokes without losing the overall context of touching down a single finger of the non-preferred hand
the workspace . Implementations of the pen and computing ( recognized by a single touch without a corresponding bump
device sensor correlation technique support both the Mag - signal on the pen ). If the pen is held in the Writing grip , this
nifier and Full- Canvas Zooming by sensing how the user is 30 brings up a small palette that offers various pen + touch tool
interacting with the pen and touch -sensitive computing modes , including an Airbrush and a Compass and so forth .
device ( e. g ., tablet ). In some implementations of the pen and computing device
When the user stows the pen (in the Tuck or Palm grip ), sensor correlation technique the Drafting Tools menu is
implementations of the pen and computing device sensor invoked as soon as the user touches down his finger. In some
correlation technique recognize this grip . If the user then 35 technique implementations the Airbrush is the initial default
brings two fingers into contact with the display , implemen - mode. In some technique implementations the user can then
tations of the pen and computing device sensor correlation tap on another tool ( such as the Compass ) to change modes .
technique check the pen for a corresponding " bump" that In some implementations , all drafting tools are implemented
occurs at approximately the same time as the touch signal. as spring - loaded modes ; that is, the mode is maintained only
When implementations of the pen and computing device 40 so long as the user holds down his finger. Note also that the
sensor correlation technique see this combined pen bump + Drafting Tools menu , by default, can activate the most
two - finger touch signal, it brings up the Magnifier/ Loupe recently used tool (mode ) when the user initiates contact.
Tool. Note that the two - finger touch signal does not require This makes repeated use of the same tool at multiple
the fingers to touch at precisely the same time; a short grace locations more efficient. Implementations of the Airbrush
period is allowed so that non - simultaneous touches can be 45 tool, the Compass tool, as well as a Single - Tap Virtual Pen
correctly recognized as calling up the Loupe. In some Barrel button are described in more detail below .
implementations a response to a single finger touch is Additionally, in some implementations, an " advanced
delayed slightly for a second finger touch to arrive . In other drafting tools” capability can use sensors on the pen and
implementations a single finger action is started and then sensors on the touch - sensitive computing device to detect
canceled (or undone ) if a second finger is detected during an 50 that the sensor pen is held in a writing grip in the user' s
allotted time window . Spatial constraints on how close ( or preferred hand and to detect a two - finger touch gesture such
far ) the two finger contacts must be can also be applied , if as a pinching motion at the touch -sensitive display of the
desired . computing device with the user 's non - preferred hand. The
If some implementations of the pen and computing device pinching motion with the user's non- preferred hand brings
sensor correlation technique see a two - finger touch without 55 up a set of advanced drafting tools when the pen is ready to
any corresponding bump on the stylus , the implementations write . These special tools further support use of the pen that
instead infer that the user made the touch with their other benefits from a second touch . For example , these drafting
(nonpreferred ) hand , which is not holding the pen . In some tools can include a ruler or alignment edge , a French curve
implementations , this then triggers the standard two -finger or a function to pull a new sheet of paper (with two - finger
pan and zoom interaction to allow Full Canvas Zoom 1200 60 position and rotation ). The user may also move back and
(rather than the focus -plus -contextMagnifier Tool) as shown forth between the Advanced Drafting Tools and the standard
in FIG . 12 . drafting tools by lifting ( or putting back down ) one of the
In some implementations, the Magnifier /Loupe Tool two fingers ; the tool set shown corresponds to the number of
zooms only the region of the canvas under the circular tool. fingers placed on the display .
The Magnifier /Loupe Tool interactively resizes itself 65 5 .1 .6 .1 Airbrush Tool
according to the spread between the user 's two fingers . The One implementation of the Airbrush tool initially shows
user may also touch down a finger on the border of the cursor feedback , as a gray dashed ellipse , of where the
US 10 , 168, 827 B2
23 24
airbrush will spray if the user starts the paint flow . The user 5 .1 .6 .3 Compass Tool
indicates where the airbrush tool should spray by the posi The Drafting Tools palette also includes a Compass,
tion of the (non -preferred hand ) finger. This is necessary which supports a pen + touch mode where the pen is con
because Gas noted previously inertial sensing cannot deter strained to draw circular arcs centered about the current
5 location of the finger (again of the non -preferred hand ).
mine the absolute (x ,y,z ) location of the pen tip above the 51 5 . 1 . 7 The Pen Controls
tablet, only a 3D orientation relative to the tablet. The As another example , a single - finger contact such as a tap
relative sensing of the pen orientation can be demonstrated while the pen is stowed brings up a small in - place palette
by rotating the touch - sensitive computing device (e .g., tab containing the Pen Controls ( FIG . 14 , 1400 ) , allowing the
let), rather than the stylus . user to change modes, or to modify the pen color and stroke
In one implementation , the user controls the spray ellipse " thickness, without making a round- trip to a toolbar docked
by changing the azimuth and elevation angles of the pen at the edge of the workspace . This example again takes
with respect to the touch - sensitive screen /tablet surface . Theadvantage of the bump generated on the pen when the user
user can hold the pen well above the screen , making it easy taps the touch screen from an extension grip , using any
to angle the pen as desired , unlike a previous exploration of a single finger to make touch screen contact . The tools appear
an airbrush - like tool which did not employ the pen tilt 15 menusnext to the finger. The user may then interact with the radial
using either pen or touch , as studies have consistently
angles , likely because it is difficult to reorient the pen while found that users expect pen and touch to be interchangable
also keeping it within the limited – 15 mm proximity sensing for UI controls . Note that in some implementations these or
range of the tablet. In some implementations, a separate related controls may be activated by either a tap (i.e . a
sensing channel (such as a proximity sensor ) may be used to 20 finger-down + finger- up sequence ) or by a tap -and-hold (fin
determine the altitude of the pen above the screen , which ger-down +maintaining finger contact with the screen ). The
then determines the size (extent) of the spray ellipse . latter is particularly conducive to spring - loaded modes,
In some implementations, the user turns the paint flow on which maintained as long as the finger remains in contact
and off by tapping their index finger on the barrel of the pen , with the digitizer.
which is sensed as further detailed below . When the user 25 5 .1.8 Canvas Tools
activates the spray , the feedback changes to a bold red The Canvas Tool ( FIG . 15 , 1500 ) uses sensors of the pen
dashed ellipse to give clear feedback of the shape of the and on the touch - sensitive computing device to detect when
spray being produced . In one prototype , the highly trans a user is holding the sensor pen in a non -writing position in
parent ellipses are " sprayed ” onto the canvas. The user may the user 's preferred hand and to detect when a single contact
also adjust the size of the ellipse , or the transparency of the& 30 such as for example a finger tap with the user 's non
preferred hand on the touch screen of the computing device .
spray , by sliding the finger in contact with the pen barrel as These correlated concurrent actions cause a menu of canvas
if it were a physical slider control. The motion of the finger tools to be displayed on the display of the computing device .
contact can be sensed and tracked by the pen 's grip sensors . For example , this menu of tools could include undo /redo ,
5 . 1 .6 . 2 Single - Tap Virtual Pen Barrel Button
Implementations of the technique successfully supports 35 Like
cut- copy -paste , new page, search and similar commands .
the pen tools, in some implementations the canvas tools
single - tap activation of a virtual barrel button by strategi and related tools can be activated by either a tap or tap -and
cally combining some or all of the stylus sensing channels. hold gesture , depending on the implementation .
Implementations of the single - tap virtual pen barrel button 5 . 1. 9 Touch -Sensitive Computing Device / Tablet Grip
described herein use grip sensing data in addition to motion 40 Detection
data of the pen . Implementations of the pen and computing device sensor
To identify candidate tap events, some implementations correlation technique use capacitive grip sensing on the back
of the pen and touch - sensitive computing device sensor and sides of the touch -sensitive computing device case to
correlation technique look for a bump signal on the barrel of detect a number of additional contacts and to determine their
the pen from the finger tap at the same time that a new touch 45 contexts .
contact appears on a capacitance image map created by 5 . 1.9 . 1 Thumb Menu and Handedness Detection
using capacitance sensors on the barrel of the pen . However, There are many instances where the user picks up and
this alone may not be sufficient to filter out false positives holds a touch -sensitive computing device ( e . g ., tablet or
produced by re - gripping the pen , because shifting grips can other similar device ) with both hands , making the pen
also produce bump signals coincident with new finger 50 unavailable . Implementations of the pen and computing
contacts . To filter these out, some implementations of the device sensor correlation technique use grip to sense which
technique rely on the observation that users hold the pen still hand the user is holding the touch -sensitive computing
in a Writing grip to maintain a stable tripod grasp when they device with . Implementations of the pen and computing
lift the index finger to tap on the barrel (per observation B9). device sensor correlation technique then use this to summon
This is advantageous because detection of the Writing grip 55 a Thumb Menu ( FIG . 5 , 534 ) at the appropriate side of the
provides a gate to make sure false detections are unlikely to touch - sensitive computing device (e . g ., tablet ), which
occur. Various implementations of the technique therefore allows the user to activate various buttons and menus
look at the ongoing accelerometer and gyro signals and directly with the thumb . If the user grasps the touch
compute a simple time- decaying motion signal to determine sensitive computing device ( e. g ., tablet) with a second hand ,
whether a device is moving or held still. Then only the 60 implementations of the pen and computing device sensor
candidate tap events that occur when the pen is not moving correlation technique leave the Thumb Menu visible at the
are accepted , which effectively filters out any false contacts. side where it first appeared . In other implementations , it may
In one working implementation , the pen must remain in a be split (or duplicated ) across the two thumbs.
new moving (or not moving ) state for at least 100 ms. If the user is observed grasping the pen while holding the
Otherwise , the pen barrel tap itself can trigger brief activa - 65 touch -sensitive computing device with one hand, implemen
tion of a " moving ” signal, which of course would thwart tations of the pen and computing device sensor correlation
recognition of the barrel tap . technique can immediately infer the user 's handedness. The
US 10 , 168 ,827 B2
25 26
hand holding the pen is inferred to be the preferred hand , and grips of the primary and secondary users on the pen are
the hand holding the touch -sensitive computing device (e . g ., correlated to determine the context of the grips and to initiate
tablet) may be presumed to be not the preferred hand . Some a command in an application executing on the touch
technique implementations can also tell if the pen -holding sensitive pen (and/or computing device in communication
hand grabs or grips the touch - sensitive computing device 5 with the pen ). For example , the grip of the secondary user
( e. g ., tablet) by looking for a bump on the pen at the time the can be determined to represent a handoff of the pen to the
user ' s hand contacts a grip sensor on the touch -sensitive secondary user from the primary user. In this case data can
computing device (e. g., tablet). be transferred from the primary user to the secondary user
5 . 1.9 .2 Detecting Unintentional Thumb Contact via the handoff of the pen . Alternately, one or more capa
In some implementations of the pen and computing 10 bilities of the touch -sensitive pen can be restricted following
device sensor correlation technique when the Thumb Menu the handoff.
first appears, it fades in over a short ( e. g., 1 .5 second ) Similarly , FIG . 17 shows a primary user 1702 passing a
interval, and likewise if the user lets go of the touch - touch - sensitive computing device 1704 to a secondary user
sensitive computing device it fades out after a short time 1706 . Some implementations of the pen and computing
(e.g ., 350 ms). The purpose of this animation feedback is to 15 device sensor correlation technique employ sensors to detect
present the Thumb Menu in a tentative state , so that if the the grips of the two users on the touch -sensitive computing
user's thumb strays onto the touch screen while picking up device using the devices 's case grip sensing to determine
the tablet, the thumb contact can be ignored or otherwise when each user is grasping the device , and the device 's
treated as a likely -unintentional input. orientation to determine if it is level. The detected grips and
Implementations of the pen and computing device sensor 20 the orientation of the device (as well as possibly other data )
correlation technique infer that a thumb represents an unin - can be used to assign each user a different role with a
tentional touch if it occurs at the same time as ( or soon after ) different set of permissions in using the touch -sensitive
a new hand grip on the corresponding back portion of the computing device. When these conditions are detected in
tablet case. Some implementations of the pen and computing some implementations a special annotation layer peels over
device sensor correlation technique then detect the thumb as 25 the screen , as if a transparency or a sheet of vellum had been
an unintentional contact, and freeze the fade - in of the dropped over the display. The other user is then free to
Thumb Menu if the unintentional thumb contact overlaps it . annotate the content, but not to make digital copies or
This feedback indicates to the user that the thumb contact navigate to other documents or files. This is a very different
has been recognized , but intercepted to prevent accidental and much more limited form of sharing than the digital
activation of the menu . The user can then intentionally 30 transfer of information supported by other cross - device
interact with the Thumb Menu , if desired , simply by lifting information transfer techniques. Implementations of the pen
the thumb and bringing it back down on the menu . The and computing device sensor correlation technique do not
fade - in animation continues as soon as the user lifts his or trigger Tablet Handoff when a single user holds the display
her thumb. If the user does not place the thumb on the screen up with two hands ; during such interactions, users tend to
when picking up the touch -sensitive computing device, the 35 angle the tablet towards themselves, and thus it is not level.
fade -in also serves as secondary cue that the Thumb Menu In a similar manner, interactions where the device is per
is fully ready for use . Since accidental activation mainly fectly flat on a desk can be detected such that users will not
tends to occur when the user first grasps the touch -sensitive unintentionally trigger Tablet Handoff if they happen to
computing device , after a few seconds elapse it is assumed touch or hold their device with both hands .
that any hand contact with the screen was intentional. This 40 5 .2 History Prior to the Correlation with Bump on Sen
therefore illustrates how the detection scheme of implemen - sors .
tations of the pen and computing device sensor correlation Some implementations of the pen and computing device
technique block unintentional touch , while also allowing sensor correlation technique analyze past data of sensors
intentional touches to get through , unlike simple thumb (obtained , for example , by constantly recording sensor data )
blocking heuristics which ignore any hand contact near the 45 on the touch - sensitive pen and / or the touch - sensitive com
edge of the screen in certain applications. puting device . This data can be correlated , for example, with
5 . 1. 10 .3 Handoff : Passing the Pen or the Touch -Sensitive sensor data taken at the time of a bump that is associated
Computing Device to Another User with touch down of the pen to confirm or reinforce that the
In implementations of the pen and computing device touch was correlated with the movement of the pen . Imple
sensor correlation technique passing a touch -sensitive com - 50 mentations of the technique that use past correlation data in
puting device ( e .g ., a tablet) or a pen to another user is used addition to present correlation data can be more robust than
as a way to offer an alternative , more physical semantic of those that do not. For example, theremay be an acceleration /
sharing content with another user. Studies of passing pre - declaration pattern associated with the pen a little bit before
hension and other user observations indicate that users go sensor data associated with a person putting down his or her
through a sequence of specific motions. First they extend the 55 hand to write that is recognized at the time of the bump
object while holding it approximately level to offer it to the detection to confirm that the contact is a palm and not by
other person , then they maintain their grip until the other chance a touch down with the non - dominant hand while the
person has firmly grasped the object. The person passing the pen is being moved at the same time.
device then lets go, and the other person brings it in closer 6 .0 Exemplary System Hardware :
to their body , often while also orienting the screen to his or 60 In a prototype implementation , to support the range of
her preferred viewing angle. All or parts of this sequence can context- sensing techniques envisioned , custom hardware
be sensed to detect passing prehension interactions . was designed to augment the pen and touch - sensitive com
For example, FIG . 16 shows a primary user 1602 passing puting device with inertial sensors and capacitive grip
a pen 1604 to a secondary user 1606 . The grip of the primary sensing, as well as custom software/ firmware to handle
user on the touch -sensitive pen is sensed ( e.g ., a baton grip ). 65 simultaneous pen + touch events from the touch screen . The
At the same time, or almost the same time, the grip of the description of the following exemplary prototype hardware
secondary user on the touch -sensitive pen is also sensed . The and software / firmware is not meant to be limiting but is
US 10 , 168 ,827 B2
27 28
provided to show how implementations discussed herein 6 .5 Inertial Sensor Fusion
could be implemented . Those with ordinary skill in the art Sometechnique implementations combine accelerometer,
will realize that many other implementations are possible . gyro , and magnetometer sensor inputs using a direction
6 . 1. Pen Hardware Design cosine matrix algorithm . This produces stable yaw , pitch ,
In one exemplary working implementation , a flexible 5 and roll values in an east-north -up Earth coordinate frame.
capacitive grid consisting of 7x30 sensing elements covers This is used to derive the pen orientation in a consistent
the entire barrel of a pen , which was wrapped in heat- shrink reference frame relative to the tablet.
tubing to protect the sensors and to provide a smooth and 6 .6 Pen Grip Classification
The prototype implementation recognizes four distinct
easy -to - grip cylindrical surface for sensing. The interior of 10 grips
the pen consists of a 3D -printed case that holds a miniature is not: held Writing , Tuck , Palm , and No Grip ( for when the pen
). Per observed behavior B4 (Grip vs . Pose ), the
electromagnetic pen , a 4 AAAA battery , and custom cir grip recognition considers the pattern of hand contact (ca
cuitry . For inertial sensing a gyroscope as well as an pacitive grip sensing ) as well as the pose (orientation ) of the
accelerometer /magnetometer module was used . For capaci stylus. The prototype implementation of the system pro
tive sensing a touch controller was employed . The pen 15 cesses the incoming data to extract salient features and then
streams all data to a host computer using a transceiver trains a multi - class classifier to extract the pen grips. Tech
operating at 2 Mbps. A microcontroller runs the firmware. nique implementations perform a multi -class classification
In this particular implementation , all inertial sensor data of the pen grip patterns by using a set of one- vs-all learners ,
is streamed off the pen at 130 Hz, and the 7x30 capacitance where each learner is a Support Vector Machine (SVM )
map at 30 Hz. The resulting pen is 19 cm long with a 13 mm 20 classifier. The result is a probability distribution over all four
external diameter. grips.
6 .2 Touch - Sensitive Computing Device / Tablet Hardware Technique implementations select features for grip clas
Design sification that take into account unique considerations of the
The touch -sensitive computing device in this exemplary pen form -factor. In particular, since the pen is symmetrical
prototype system is a tablet computer. The tablet case covers 25 (cylindrical) along the axis of the pen barrel, the sensed grips
the entire back surface and sides of the tablet . The rigid case are agnostic to the roll angle of the pen . Similarly , whether
is constructed from printed circuit boards consisting of the user grips the pen lower or higher on the barrel, or the
44x26 capacitive sensing elements. There is a small insen size of his hand , should not affect the grip classification .
Thus some implementations of the system compute a nor
sitive area ( in the middle of the case on the back side ) wherehe 30 malized
the integrated circuits are mounted . The case includes the 30 roll angleimage invariant with respect to both grip height and
. From the raw 7x30 capacitance map , the system
same sensor components as the pen , except there are four fits the non - zero capacitance values into a 7x10 normalized
touch controllers for different parts of the capacitive grid . image . The capacitance map is shifted in the y -dimension so
The tablet sensor data is streamed via USB , with the tablet 's that the first row of lit (non - zero capacitance ) pixels corre
inertial sensors sampled at 100 Hz and the tablet's 44x26 20 3535 sponds to the bottom of the normalized image , and then
capacitance map sampled at 25 Hz. scale the non -zero capacitance pixels to fit in the 10 rows of
6 .3 Simultaneous Pen and Touch from the Touch Screen the normalized image. The features employed for grip
Multi-touch and pen events were handled by intercepting classification therefore include the pen yaw and pitch angles ,
them directly from a Human Interface Device (HID ) con - the normalized grip image , as well as the normalized grip
troller. Using this approach a Samsung Corporation Series 7 40 histogram in the x and y dimensions. Technique implemen
Slate was used for the prototype that can report up to 8 touch tations also include features for the number of lit (non -zero )
contacts simultaneously with pen input. pixels, and the pixel sum of all 7x30 capacitance values
6 .4 Software/Firmware Design from the raw capacitance map .
Some technique implementations aggregate and time 6 .6 Grip Training Dataset Collection
stamp the pen and tablet sensor data on an external PC for 45 In one implementation , nine right -handed participants ( 4
processing, and then transmit relevant input events to the female ), all of whom had prior exposure to pen and tablet
tablet for presentation in a user interface . Furthermore , some use , were used to generate a grip training dataset . Users were
technique implementations compute grip recognition on the led through a script illustrating specific grips and actions to
external PC . As such , one implementation consists of a perform in each grip . These included stowing the pen while
distributed system with four cooperating components : the 50 using touch (per observed behavior B1) from both the Tuck
touch - sensitive pen , the tablet case , the tablet itself, and the and Palm grips (behavior B2). Different sequences of tasks
external PC . It should be noted , however, that aggregating were also included to capture various common transitions
and time-stamping the sensor data , as well as grip recogni - between grips ( behavior B3). Users were led through the full
tion , can be performed on the tablet computer thereby range of supination for each grip (behavior B4) which
obviating the need for the standalone personal computer 55 included transitions between Writing and the single - finger
( PC ) . The case can be directly integrated into the touch - and two - finger extension grips (behavior B5 ) , with articu
sensitive computing device . In some implementations, time- lation of direct-manipulation gestures such as tapping, drag
stamped data from all the distributed event streams are ging , and pinching . However, no particular tripod grip to use
queued up in synchronized buffers. Some technique imple - was specified , but rather users were allowed to hold the pen
mentations then handle events from these buffers up until the 60 naturally so that the dataset would capture cross -user varia
latesttime-stamp for which all events are available. In other tions in Writing grips (per behavior B6 ). The data collection
implementations , some events may be dispatched in real- lasted approximately 15 minutes per user, with a total of
time but then subsequently modified if future event samples 1200 samples for each user,per grip , yielding a total training
arrive from other sources that would alter their interpreta - dataset of 1200x3 gripsx9 users = 32400 samples .
tion . This can be achieved by delaying feedback , displaying 65 6 .6 Pen Grip Recognition Accuracy
tentative feedback , or undoing speculative actions in end A 10 - fold cross - validation using the collected grip train
user applications. ing dataset yielded an overall accuracy of 88 % for a user
US 10 , 168 ,827 B2
29 30
independent model. A separate check with nine additional interface 1830 for receiving communications from sensor
right-handed users was conducted , none of whom had con pen device 1835 . The computing device 1800 may also
tributed data to the training dataset . This yielded user - include one or more conventional computer input devices
independent grip recognition accuracy of 93 % for the Writ- 1840 or combinations of such devices (e .g ., pointing
ing grip , 93 % for the Tuck grip , and 77 % for the Palm grip . 5 devices, keyboards, audio input devices, voice or speech
The relatively low recognition rate for the Palm grip based input and control devices , video input devices, haptic
appeared to stem from several users ' tendency to hold the input devices , touch input devices , devices for receiving
pen very lightly in this grip , resulting in a somewhat w ired or wireless data transmissions, etc . ). The computing
inconsistent pattern of contact sensed by the capacitive grip device 1800 may also include other optional components ,
array . However, the system was still able to distinguish 10 such as , for example , one or more conventional computer
Writing vs. non -writing grips (i.e . Tuck or Palm ) with 97 % output devices 1850 ( e . g., display device ( s ) 1855 , audio
accuracy . Since most interaction techniques do not depend output devices, video output devices, devices for transmit
on any distinction between the Tuck versus Palm grips, this ting wired or wireless data transmissions, etc . ). Note that
user -independent grip model, which works well enough typical communications interfaces 1830 , input devices 1840 ,
even without collecting training data for newly -encountered 15 output devices 1850 , and storage devices 1860 for general
users, was used .
us purpose computers are well known to those skilled in the art,
7.0 Exemplary Operating Environments : and will not be described in detail herein .
Implementations of the pen and computing device sensor The computing device 1800 may also include a variety of
correlation technique described herein are operational computer readable media . Computer readable media can be
within numerous types of general purpose or special purpose 20 any available media that can be accessed by computer
computing system environments or configurations. FIG . 18 device 1800 via storage devices 1860 and includes both
illustrates a simplified example of a general- purpose com volatile and nonvolatile media that is either removable 1870
puter system in combination with a pen or pen enhanced and /or non -removable 1880 , for storage of information such
with various sensors with which various implementations as computer- readable or computer - executable instructions ,
and elements of the pen and computing device sensor 25 data structures, program modules, or other data . By way of
correlation technique , as described herein , may be imple example , and not limitation , computer readable media may
mented . It should be noted that any boxes that are repre - comprise computer storage media and communication
sented by broken or dashed lines in FIG . 18 represent media . Computer storage media refers to tangible computer
alternate implementations of the simplified computing or machine readable media or storage devices such as
device and sensor pen , and that any or all of these alternate 30 DVD 's , CD ' s, floppy disks , tape drives , hard drives , optical
implementations , as described below , may be used in com - drives, solid state memory devices, RAM , ROM , EEPROM ,
bination with other alternate implementations that are flash memory or other memory technology , magnetic cas
described throughout this document . settes, magnetic tapes , magnetic disk storage , or other mag
For example , FIG . 18 shows a general system diagram netic storage devices, or any other device which can be used
showing a simplified touch -sensitive computing device 35 to store the desired information and which can be accessed
1800 . In general, such touch -sensitive computing devices by one or more computing devices.
1800 have one or more touch -sensitive surfaces 1805 or Storage of information such as computer - readable or
regions ( e . g ., touch screen , touch sensitive bezel or case , computer - executable instructions, data structures , program
sensors for detection of hover- type inputs, optical touch modules , etc ., can also be accomplished by using any of a
sensors , etc . ). Examples of touch - sensitive computing 40 variety of the aforementioned communication media to
devices 1800 include, but are not limited to , touch -sensitive encode one ormore modulated data signals or carrier waves ,
display devices connected to a computing device , touch - or other transport mechanisms or communications protocols,
sensitive phone devices , touch - sensitive media players , and includes any wired or wireless information delivery
touch -sensitive e-readers, notebooks , netbooks, booklets mechanism . Note that the terms “ modulated data signal” or
( dual -screen ), tablet type computers , or any other device 45 “ carrier wave ” generally refer to a signal that has one or
having one or more touch -sensitive surfaces or inputmodali - more of its characteristics set or changed in such a manner
ties. as to encode information in the signal. For example , com
To allow a device to implement the pen and computing munication media includes wired media such as a wired
device sensor correlation technique, the computing device network or direct -wired connection carrying one or more
1800 should have a sufficient computational capability and 50 modulated data signals, and wireless media such as acoustic ,
system memory to enable basic computational operations. In RF, infrared , laser, and other wireless media for transmitting
addition , the computing device 1800 may include one or and /or receiving one or more modulated data signals or
more sensors 1810, including, but not limited to , acceler - carrier waves . Combinations of the any of the above should
ometers , gyroscopes, magnetometer, finger print detectors, also be included within the scope of communication media .
cameras including depth cameras, capacitive sensors , prox - 55 Retention of information such as computer- readable or
imity sensors, microphones, multi-spectral sensors, etc . As computer -executable instructions, data structures , program
illustrated by FIG . 18 , the computational capability is gen - modules , etc ., can also be accomplished by using any of a
erally illustrated by one or more processing unit ( s ) 1825 , and variety of the aforementioned communication media to
may also include one or more GPUs 1815 , either or both in encode one or more modulated data signals or carrier waves ,
communication with system memory 1820 . Note that the 60 or other transport mechanisms or communications protocols ,
processing unit(s ) 1825 of the computing device 1800 of and includes any wired or wireless information delivery
may be specialized microprocessors , such as a DSP, a VLIW , mechanism . Note that the terms “ modulated data signal” or
or other micro -controller, or can be conventional CPUs " carrier wave” generally refer to a signal that has one or
having one or more processing cores , including specialized more of its characteristics set or changed in such a manner
GPU -based cores in a multi- core CPU . 65 as to encode information in the signal. For example , com
In addition , the computing device 1800 may also include munication media includes wired media such as a wired
other components, such as, for example , a communications network or direct-wired connection carrying one or more
US 10 , 168 ,827 B2
31 32
modulated data signals , and wireless media such as acoustic , computing device sensor correlation technique . For
RF, infrared , laser, and other wireless media for transmitting example , various devices used to enable some of the many
and/ or receiving one or more modulated data signals or implementations of the pen and computing device sensor
carrier waves. Combinations of the any of the above should correlation technique described herein include pens , point
also be included within the scope of communication media . 5 ers, pen type input devices. However, the functionality
Further , software, programs, and /or computer program described herein may be implemented in any desired form
products embodying the some or all of the various imple factor, e . g ., phone , wand , staff, ball racquet, toy sword , etc .,
mentations of the pen and computing device sensor corre for use with various gaming devices, gaming consoles , or
lation technique described herein , or portions thereof, may other computing devices. Further , the sensor pens described
be stored , received , transmitted , and/ or read from any 10 herein are adapted to incorporate a power supply and various
desired combination of computer or machine readable media combinations of sensors including, but not limited to inertial
or storage devices and communication media in the form of sensors , cameras including depth cameras , accelerometers,
computer executable instructions and /or other data struc pressure sensors , grip sensors , near- field communication
tures .
Finally, the pen and computing device sensor correlation 15 sensors, RFID tags and /or sensors , temperature sensors,
technique described herein may be further described in the microphones, magnetometers , capacitive sensors , gyro
general context of computer -executable instructions, such as scopes, etc ., in combination with various wireless commu
program modules , being executed by a computing device . nications capabilities for interfacing with various computing
Generally , program modules include routines , programs, devices. Note that any or all of these sensors may be
objects , components , data structures, etc ., that perform par - 20 multi - axis or multi -position sensors (e . g ., 3 -axis accelerom
ticular tasks or implement particular abstract data types . The eters , gyroscopes , and magnetometers ). In addition , in vari
implementations described herein may also be practiced in ous implementations, the sensor pens described herein have
distributed computing environments where tasks are per been further adapted to incorporate memory and /or comput
formed by one or more remote processing devices, or within ing capabilities that allow the sensor pens to act in combi
a cloud of one or more devices, that are linked through one 25 nation or cooperation with other computing devices , other
or more communications networks . In a distributed comput- sensor pens, or even as a standalone computing device .
ing environment, program modules may be located in both While the pen and computing device sensor correlation
local and remote computer storage media including media
storage devices. Still further , the aforementioned instruc technique senses actual touch to a sensor pen and a touch
tions may be implemented , in part or in whole, as hardware 30 sensitive computing device , it may also be employed with
logic circuits , which may or may not include a processor. virtual touch inputs. Virtual touch inputs relative to pro
The sensor pen device 1835 illustrated by FIG . 18 shows jected displays, electronic whiteboards, or other surfaces or
a simplified version of a pen or pen augmented with pen objects are treated by the pen and computing device sensor
sensors 1845 , logic 1865, a power source 1875 (e .g ., a correlation technique in the same manner as actual touch
battery ), and basic I/O capabilities 1885 . As discussed 35 inputs on a touch -sensitive surface . Such virtual touch inputs
above , examples of pen sensors 1845 for use with the sensor are detected using conventional techniques such as, for
pen device 1835 include, but are not limited to , inertial example , using cameras or other imaging technologies to
sensors , cameras including depth cameras, proximity sen track user finger movement relative to a projected image ,
sors, finger print sensors , galvanic skin response sensors ,
accelerometers , pressure sensors , grip sensors, near- field 40 relative to text on an electronic whiteboard , relative to
physical objects, etc .
communication sensors , RFID tags and/ or sensors , tempera
ture sensors , microphones , magnetometers , capacitive sen In addition, it should be understood that the pen and
sors , gyroscopes, etc . computing device sensor correlation technique is operable
In general, the logic 1865 of the sensor pen device 1835 with a wide variety of touch and flex - sensitive materials for
is similar to the computational capabilities of computing 45 determining or sensing touch or pressure. For example, one
device 1800 , but is generally less powerful in terms of touch -sensing technology adapted for use by the pen and
computational speed ,memory, etc . However , the sensor pen computing device sensor correlation technique determines
device 1835 can be constructed with sufficient logic 1865 touch or pressure by evaluating a light source relative to
such that it can be considered a standalone capable compu some definite deformation of a touched surface to sense
tational device . 50
contact. Also , it should be noted that sensor pens, as dis
The power source 1875 of the sensor pen device 1835 is cussed herein , may include multiple types of touch and /or
implemented in various form factors, including, but not pressure sensing substrates. For example , sensor pens may
limited to , replaceable batteries, rechargeable batteries,
capacitive energy storage devices, fuel cells , etc. Finally, the be both touch -sensitive and/or pressure sensitive using any
I/ O 1885 of the sensor pen device 1835 provides conven - 55 combination of sensors , such as, for example , capacitive
tional wired or wireless communications capabilities that sensors, pressure sensors , flex - or deformation -based sen
allow the sensor pen device to communicate sensor data sors , depth sensors , etc .
and /or other information to the computing device 1800 . It is intended that the scope of the disclosure be limited
The foregoing description of the pen and computing not by this detailed description , but rather by the claims
device sensor correlation technique has been presented for 60 appended hereto . Although the subject matter has been
the purposes of illustration and description . It is not intended described in language specific to structural features and /or
to be exhaustive or to limit the claimed subjectmatter to the methodological acts, it is to be understood that the subject
precise form disclosed . Many modifications and variations
are possible in light of the above teaching . Further, it should matter defined in the appended claims is not necessarily
be noted that any or all of the aforementioned alternate 65 limited to the specific features or acts described above.
implementationsmay be used in any combination desired to Rather, the specific features and acts described above are
form additional hybrid implementations of the pen and disclosed as example forms of implementing the claims.
US 10 , 168 ,827 B2
33 34
What is claimed is: detecting a contact from the user 's non - preferred hand on
1. A computer-implemented process for initiating user a display of the touch -sensitive computing device ; and
interface actions, comprising using a computer for: displaying context- specific tools in response to the con
determining a user' s preferred hand and non -preferred tact.
hand from sensor inputs , wherein the user 's preferred 5 10 . The computer- implemented process of claim 7 , fur
hand is determined by : ther comprising:
sensing sensor signals representing a bump in a motion of determining that the touch - sensitive pen is held in a grip
a touch -sensitive pen when a contact is made with a indicating that the touch - sensitive pen is in a position to
touch -sensitive computing device ; and write in the user ' s preferred hand ;
correlating sensor signals representing the bump in the 10 detecting a contact from the user 's non - preferred hand on
touch -sensitive pen ’s motion with signals representing a display of the touch -sensitive computing device; and
a motion of the touch -sensitive computing device to displaying context - specific tools in response to the con
determine the user ' s preferred hand ; and tact .
initiating a user interface action based on the determina 11 . The computer- implemented process ofclaim 7, further
tion of the user ' s preferred hand and non -preferred 15 comprising :
hand . determining that the touch - sensitive pen is held in a grip
2 . The computer- implemented process of claim 1, further indicating that the touch -sensitive pen is in a position to
comprising determining the user 's preferred hand by: write in the user 's preferred hand ;
sensing the sensor signals representing the bump in a detecting a multi- finger touch gesture from the user ' s
motion of the touch -sensitive pen when the contact is 20 non -preferred hand on the display of the touch -sensi
made with the touch -sensitive computing device ; and tive computing device; and
correlating the sensor signals representing the bump in the displaying context-specific tools in response to the multi
touch -sensitive pen ’s motion with the sensor signals finger touch gesture.
representing the motion of the touch - sensitive comput- 12 . The computer -implemented process of claim 1, fur
ing device to determine that the hand holding the 25 ther comprising :
touch -sensitive pen is the preferred hand . determining which hand a user is holding the touch
3. The computer- implemented process of claim 1 further sensitive computing device with ; and
comprising: displaying a menu in a location on the screen of the
correlating the sensor signals representing the touch touch -sensitive computing device that can be activated
sensitive computing device motion with touch patterns 30 by the user with the thumb of the hand holding the
on the touch -sensitive computing device ; and touch -sensitive computing device .
using the correlated motion and touch patterns on the 13 . The computer-implemented process of claim 1,
touch -sensitive computing device to suppress acciden - wherein one or more of the signals represent a finger print
tal screen content rotation on a display of the touch of a user which is used to identify the user.
sensitive computing device . 35 14 . The computer -implemented process of claim 1, fur
4 . The computer- implemented process of claim 1 wherein ther comprising using historical signal data to improve the
the user interface action is displaying a menu on a display of correlation of the signals .
the touch -sensitive computing device . 15 . A computer -implemented process for initiating user
5 . The computer - implemented process of claim 4 wherein interface actions, comprising using a computer for:
the menu displayed is dependent on whether the user's 40 determining a user's preferred hand and non -preferred
preferred hand or non -preferred hand is touching the touch hand from sensor inputs , wherein the user 's preferred
sensitive computing device . hand is determined by :
6 . The computer -implemented process of claim 1, further sensing sensor signals representing a bump in a motion of
comprising: a first touch -sensitive device when a contact is made
determining whether the user is making a gesture with the 45 with a second touch - sensitive device ; and
user ' s preferred hand or non -preferred hand . correlating sensor signals representing the bump in the
7 . The computer -implemented process of claim 1 , further first touch -sensitive device ' s motion with signals rep
comprising: resenting a motion of the second touch - sensitive device
determining grip patterns on the touch -sensitive pen and to determine the user' s preferred hand ; and
grip patterns on the touch -sensitive computing device ; 50 using the determination of the user 's preferred hand and
using the grip patterns, determining whether the user is non -preferred hand in an application .
making a gesture with a bare hand or whether the user 16 . The computer- implemented process of claim 15
is making a gesture with a hand holding the touch - wherein the determination of the user preferred hand and
sensitive pen . non - preferred hand is used to initiate a context- appropriate
8 . The computer- implemented process of claim 7, further 55 command using the application .
comprising: 17 . The computer -implemented process of claim 16 ,
determining that the touch - sensitive pen is held in a wherein the context-appropriate action is enabling context
stowed position in the user 's preferred hand ; appropriate functionality on the first touch -sensitive com
detecting a contact from the user 's preferred hand on a puting device or the second touch - sensitive computing
display of the touch -sensitive computing device ; and 60 device .
displaying context-specific tools in response to the con - 18 . The computer-implemented process of claim 16 ,
tact. wherein the determination of the user 's preferred hand or
9 . The computer -implemented process of claim 7, further non -preferred hand is used to detect unintentional contact
comprising: with either the first touch - sensitive computing device or the
determining that the touch -sensitive pen is held in the 65 second touch - sensitive computing device.
user ' s preferred hand in a grip indicating that the 19 . The computer- implemented process of claim 16 ,
touch -sensitive pen is not in a position to write ; wherein using the determination of the user 's preferred hand
US 10 , 168, 827 B2
35 36
and non -preferred hand is used in determining a user 's grip
on either the first touch -sensitive computing device or the
second touch - sensitive computing device .

You might also like