From 938786061843a6a32329ed111ba3eaa1cd296e6b Mon Sep 17 00:00:00 2001 From: Mike Gower Date: Tue, 30 Apr 2019 11:18:06 -0700 Subject: [PATCH 01/10] Update pointer-gestures.html --- understanding/21/pointer-gestures.html | 14 ++++++-------- 1 file changed, 6 insertions(+), 8 deletions(-) diff --git a/understanding/21/pointer-gestures.html b/understanding/21/pointer-gestures.html index a2de30c169..762f006997 100644 --- a/understanding/21/pointer-gestures.html +++ b/understanding/21/pointer-gestures.html @@ -10,16 +10,16 @@

Pointer Gestures
Understanding SC 2.5.1

Intent of this Success Criterion

-

The intent of this Success Criterion is to ensure that content can be operated using simple inputs on a wide range of pointing devices. This is important for users who cannot perform complex gestures, such as multipoint or path-based gestures, in a precise manner, either because they may lack the accuracy necessary to carry them out or because they use a pointing method that lacks the capability or accuracy.

-

A path-based gesture involves a user interaction where the gesture's success is dependent on the path of the user's pointer movement and not just the endpoints. Examples include swiping (which relies on the direction of movement), the dragging of a slider thumb, or gestures which trace a prescribed path, as in the drawing of a specific shape. Such paths may be drawn with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

-

A user may find it difficult or impossible to accomplish these gestures if they have impaired fine motor control, or if they use a specialized or adapted input device such as a head pointer, eye-gaze system, or speech-controlled mouse emulation.

-

Note that free-form drag and drop actions are not considered path-based gestures for the purposes of this Success Criterion.

-

Examples of multipoint gestures include a two-finger pinch zoom, a split tap where one finger rests on the screen and a second finger taps, or a two- or three-finger tap or swipe. A user may find it difficult or impossible to accomplish these if they type and point with a single finger or stick, in addition to any of the causes listed above.

+

The intent of this Success Criterion is to ensure that content can be operated using simple inputs on a wide range of pointing devices. This is important for users who cannot perform complex gestures in a precise manner; users may lack the precision or ability to carry out the gestures or they may use a pointing method that lacks the capability or accuracy to perform multipoint or path-based gestures.

+

A path-based gesture involves a user interaction where the gesture's success is dependent on the path of the user's pointer movement and not just the endpoints. Examples include swiping (which relies on the direction of movement) and gestures which trace a prescribed path, as in the drawing of a specific shape. Such paths may be drawn with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

+

A user may find it difficult or impossible to accomplish these gestures if they have impaired fine motor control, or if they use a specialized or adapted input device such as a head pointer, eye-gaze system, or speech-controlled mouse emulation. Note that most dragging actions including drag and drop are not considered path-based gestures for the purposes of this Success Criterion. This is because once an object is selected, it can be dragged in a wayward manner to its destination (endpoint), and need not follow a prescribed path.

+

Examples of multipoint gestures include a two-finger pinch zoom, a split tap where one finger rests on the screen and a second finger taps, or a two- or three-finger tap or swipe. A user may find it difficult or impossible to accomplish these if they type and point with a single finger or stick, in addition to any of the causes listed above.

Authors must ensure that their content can be operated without such complex gestures. When they implement multipoint or path-based gestures, they must ensure that the functionality can also be operated via single-point activation. Examples of single-point activation on a touchscreen or touchpad include taps, double taps, and long presses. Examples for a mouse, trackpad, head-pointer, or similar device include single clicks, click-and-hold and double clicks.
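As a rough sketch of this pattern, the same function can be bound both to the author-defined gesture and to a visible control, so that a single click or tap performs the same action as the gesture. The element ids and the archiveMessage function in the TypeScript below are assumptions for illustration, not a required implementation.

```typescript
// Rough sketch: a swipe gesture and a visible button trigger the same
// function, so single-point users can perform the action with one click
// or tap. Element ids and archiveMessage are hypothetical.
function archiveMessage(id: string): void {
  console.log(`Archiving message ${id}`); // application-specific behaviour
}

const row = document.querySelector<HTMLElement>("#message-42");
const archiveButton = document.querySelector<HTMLButtonElement>("#archive-42");

if (row && archiveButton) {
  // Path-based gesture: a left swipe on the row archives the message.
  let startX = 0;
  row.addEventListener("pointerdown", (e) => {
    startX = e.clientX;
  });
  row.addEventListener("pointerup", (e) => {
    if (e.clientX - startX < -30) {
      archiveMessage("42");
    }
  });

  // Single-point alternative: one click or tap on a labeled button.
  archiveButton.addEventListener("click", () => archiveMessage("42"));
}
```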

-

This Success Criterion applies to author-created gestures, as opposed to gestures defined on the level of operating system or user agent. An example for gestures provided on the operating system level would be swiping down to see system notifications, and gestures for built-in assistive technologies (AT) to focus or activate content, or to call up AT menus. Examples of user-agent-implemented gestures would be horizontal swiping implemented by browsers for navigating within the page history, or vertical swiping to scroll page content.

+

This Success Criterion applies to author-created gestures, as opposed to gestures defined on the level of operating system or user agent. Examples for gestures provided at the operating system level would be swiping down to see system notifications, and gestures for built-in assistive technologies (AT) to focus or activate content, or to call up AT menus. Examples of user-agent-implemented gestures would be horizontal swiping implemented by browsers for navigating within the page history, or vertical swiping to scroll page content.

While some operating systems may provide ways to define "macros" to replace complex gestures, content authors cannot rely on such a capability because it is not pervasive on all touch-enabled platforms. Moreover, this may work for standard gestures that a user can predefine, but may not work for other author-defined gestures.

This Success Criterion does not require all functionality to be available through pointing devices, but that which is must be available to users who use the pointing device but cannot perform complex gestures. While content authors may provide keyboard commands or other non-pointer mechanisms that perform actions equivalent to complex gestures (see Success Criterion 2.1.1 Keyboard), this is not sufficient to conform to this Success Criterion. That is because some users rely entirely on pointing devices, or find simple pointer inputs much easier than alternatives. For example, a user relying on a head-pointer would find clicking a control to be much more convenient than activating an on-screen keyboard to emulate a keyboard shortcut, and a person who has difficulty memorizing a series of keys (or gestures) may find it much easier to simply click on a labeled control. Therefore, if one or more pointer-based mechanisms are supported, then their benefits should be afforded to users through simple, single-point actions alone.

An exception is made for functionality that is inherently and necessarily based on complex paths or multipoint gestures. For example, entering one's signature may be inherently path-based (although acknowledging something or confirming one's identity need not be).

+

Note that although gestures that involve dragging are not typically considered in scope for this SC, such gestures require a higher level of fine motor control. Authors are encouraged to provide non-dragging methods for interacting with the same controls. For instance, although a slider control can be operated by dragging the 'thumb' control, a single tap or click anywhere on the slider groove can move the thumb control to the chosen position. Likewise, buttons on either side of a slider can increment and decrement the selected value and update the thumb position.
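As one way to picture the button alternative just described, buttons beside a slider can step the same value that the draggable thumb controls. The sketch below uses a standard range input (which in most browsers also moves the thumb when the groove is clicked); the element ids are assumptions for the example.

```typescript
// Rough sketch: buttons beside a slider offer a single-point way to change
// the value that can otherwise be set by dragging the thumb.
// Element ids are assumptions for this example.
const slider = document.querySelector<HTMLInputElement>("#price-slider");
const decrease = document.querySelector<HTMLButtonElement>("#price-decrease");
const increase = document.querySelector<HTMLButtonElement>("#price-increase");

if (slider && decrease && increase) {
  // Re-dispatch "input" so existing listeners react as if the thumb was dragged.
  const notify = () => slider.dispatchEvent(new Event("input"));

  decrease.addEventListener("click", () => {
    slider.stepDown(); // respects the input's min, max, and step
    notify();
  });
  increase.addEventListener("click", () => {
    slider.stepUp();
    notify();
  });
}
```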

Benefits

@@ -34,8 +34,6 @@

Examples

  • A web site includes a map view that supports both the pinch gesture to zoom into the map content and drag gestures to move the visible area. User interface controls offer the operation via [+] and [-] buttons to zoom in and out, and arrow buttons to pan stepwise in all directions.

  • A news site has a horizontal content slider with hidden news teasers that can be moved into the viewport via horizontal swiping. It also offers forward and backward arrow buttons for single-point activation to navigate to adjacent slider content.

  • -
  • A mortgage lending site has a slider control for setting the amount of credit required. The slider can be operated by dragging the thumb, but also by a single tap or click anywhere on the slider groove in order to set the thumb to the chosen position.

  • -
  • A slider control can be operated by dragging the thumb. Buttons on both sides of the slider increment and decrement the selected value and update the thumb position.

  • A kanban widget with several vertical areas representing states in a defined process allows the user to right- or left-swipe elements to move them to an adjacent silo. The user can also accomplish this by selecting the element with a single tap or click, and then activating an arrow button to move the selected element.

From f977087bada258626e0350471a08bc0dcde65bd6 Mon Sep 17 00:00:00 2001 From: Alastair Campbell Date: Wed, 15 May 2019 17:54:48 +0100 Subject: [PATCH 02/10] Updates from slider discussion --- understanding/21/pointer-gestures.html | 8 +++++--- 1 file changed, 5 insertions(+), 3 deletions(-) diff --git a/understanding/21/pointer-gestures.html b/understanding/21/pointer-gestures.html index 762f006997..66a032606c 100644 --- a/understanding/21/pointer-gestures.html +++ b/understanding/21/pointer-gestures.html @@ -11,7 +11,8 @@

Pointer Gestures
Understanding SC 2.5.1

Intent of this Success Criterion

The intent of this Success Criterion is to ensure that content can be operated using simple inputs on a wide range of pointing devices. This is important for users who cannot perform complex gestures in a precise manner; users may lack the precision or ability to carry out the gestures or they may use a pointing method that lacks the capability or accuracy to perform multipoint or path-based gestures.

-

A path-based gesture involves a user interaction where the gesture's success is dependent on the path of the user's pointer movement and not just the endpoints. Examples include swiping (which relies on the direction of movement) and gestures which trace a prescribed path, as in the drawing of a specific shape. Such paths may be drawn with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

+

A path-based gesture involves an interaction where the user engages a pointer with the display (down event), carries out a directional movement in a pre-determined direction before disengaging the pointer (up event). The direction, speed, and also the delta between start and end point may each be evaluated to determine what function is triggered.
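For illustration, the evaluation described here can be pictured as a script that records the coordinates of the down event and compares them with those of the up event. The TypeScript sketch below is a minimal, hypothetical example; the 30-pixel threshold and the onSwipeLeft/onSwipeRight callbacks are assumptions, not requirements of this Success Criterion.

```typescript
// Minimal sketch of a path-based (swipe) gesture: the delta between the
// down event and the up event determines whether a swipe is recognized.
// The threshold and callbacks are illustrative assumptions.
function attachHorizontalSwipe(
  el: HTMLElement,
  onSwipeLeft: () => void,
  onSwipeRight: () => void,
  threshold = 30 // minimum horizontal travel in CSS pixels
): void {
  let startX = 0;
  let startY = 0;

  el.addEventListener("pointerdown", (e: PointerEvent) => {
    startX = e.clientX;
    startY = e.clientY;
  });

  el.addEventListener("pointerup", (e: PointerEvent) => {
    const dx = e.clientX - startX;
    const dy = e.clientY - startY;
    // Only a mostly horizontal movement beyond the threshold counts as a swipe.
    if (Math.abs(dx) >= threshold && Math.abs(dx) > Math.abs(dy)) {
      if (dx < 0) {
        onSwipeLeft();
      } else {
        onSwipeRight();
      }
    }
  });
}
```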

+

Examples include swiping (which relies on the direction of movement) and gestures which trace a prescribed path, as in the drawing of a specific shape. Such paths may be drawn with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

A user may find it difficult or impossible to accomplish these gestures if they have impaired fine motor control, or if they use a specialized or adapted input device such as a head pointer, eye-gaze system, or speech-controlled mouse emulation. Note that most dragging actions including drag and drop are not considered path-based gestures for the purposes of this Success Criterion. This is because once an object is selected, it can be dragged in a wayward manner to its destination (endpoint), and need not follow a prescribed path.

Examples of multipoint gestures include a two-finger pinch zoom, a split tap where one finger rests on the screen and a second finger taps, or a two- or three-finger tap or swipe. A user may find it difficult or impossible to accomplish these if they type and point with a single finger or stick, in addition to any of the causes listed above.

Authors must ensure that their content can be operated without such complex gestures. When they implement multipoint or path-based gestures, they must ensure that the functionality can also be operated via single-point activation. Examples of single-point activation on a touchscreen or touchpad include taps, double taps, and long presses. Examples for a mouse, trackpad, head-pointer, or similar device include single clicks, click-and-hold and double clicks.

@@ -19,7 +20,7 @@

Intent of this Success Criterion

While some operating systems may provide ways to define "macros" to replace complex gestures, content authors cannot rely on such a capability because it is not pervasive on all touch-enabled platforms. Moreover, this may work for standard gestures that a user can predefine, but may not work for other author-defined gestures.

This Success Criterion does not require all functionality to be available through pointing devices, but that which is must be available to users who use the pointing device but cannot perform complex gestures. While content authors may provide keyboard commands or other non-pointer mechanisms that perform actions equivalent to complex gestures (see Success Criterion 2.1.1 Keyboard), this is not sufficient to conform to this Success Criterion. That is because some users rely entirely on pointing devices, or find simple pointer inputs much easier than alternatives. For example, a user relying on a head-pointer would find clicking a control to be much more convenient than activating an on-screen keyboard to emulate a keyboard shortcut, and a person who has difficulty memorizing a series of keys (or gestures) may find it much easier to simply click on a labeled control. Therefore, if one or more pointer-based mechanisms are supported, then their benefits should be afforded to users through simple, single-point actions alone.

An exception is made for functionality that is inherently and necessarily based on complex paths or multipoint gestures. For example, entering one's signature may be inherently path-based (although acknowledging something or confirming one's identity need not be).

-

Note that although gestures that involve dragging are not typically considered in scope for this SC, such gestures require a higher level of fine motor control. Authors are encouraged to provide non-dragging methods for interacting with the same controls. For instance, although a slider control can be operated by dragging the 'thumb' control, a single tap or click anywhere on the slider groove can move the thumb control to the chosen position. Likewise, buttons on either side of a slider can increment and decrement the selected value and update the thumb position.

+

Gestures that involve dragging in any direction are not in scope for this SC; however, such gestures do require fine motor control. Authors are encouraged to provide non-dragging methods; for instance, a drag and drop operation could also be achieved by selecting an item (with a tap or keyboard interaction) and then selecting its destination as a second step.
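One way to sketch such a two-step alternative is shown below: clicking or tapping a card selects it, and clicking or tapping a destination list then moves the selected card there. The .card and .drop-zone class names and the markup they imply are assumptions for the example.

```typescript
// Rough sketch of a two-step, non-dragging alternative to drag and drop:
// first select an item with a click or tap, then click or tap the destination.
// The .card and .drop-zone class names are assumptions for this example.
let selectedCard: HTMLElement | null = null;

document.querySelectorAll<HTMLElement>(".card").forEach((card) => {
  card.addEventListener("click", (e) => {
    e.stopPropagation(); // do not treat this click as a drop on the zone below
    selectedCard?.classList.remove("is-selected");
    selectedCard = card;
    card.classList.add("is-selected"); // mark the current selection
  });
});

document.querySelectorAll<HTMLElement>(".drop-zone").forEach((zone) => {
  zone.addEventListener("click", () => {
    if (selectedCard) {
      zone.appendChild(selectedCard); // complete the move without any dragging
      selectedCard.classList.remove("is-selected");
      selectedCard = null;
    }
  });
});
```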

Benefits

@@ -34,7 +35,8 @@

Examples

  • A web site includes a map view that supports both the pinch gesture to zoom into the map content and drag gestures to move the visible area. User interface controls offer the operation via [+] and [-] buttons to zoom in and out, and arrow buttons to pan stepwise in all directions.

  • A news site has a horizontal content slider with hidden news teasers that can be moved into the viewport via horizontal swiping. It also offers forward and backward arrow buttons for single-point activation to navigate to adjacent slider content.

  • -
  • A kanban widget with several vertical areas representing states in a defined process allows the user to right- or left-swipe elements to move them to an adjacent silo. The user can also accomplish this by selecting the element with a single tap or click, and then activating an arrow button to move the selected element.

  • +
  • A kanban widget with several vertical areas representing states in a defined process allows the user to right- or left-swipe elements to move them to an adjacent silo. The user can also accomplish this by selecting the element with a single tap or click, and then activating an arrow button to move the selected element.

  • +
  • A slider control restricts the movement to a strict left & right direction when operated by dragging the thumb. Buttons on both sides of the slider increment and decrement the selected value and update the thumb position.

From c0ae6f464ebac8fbb5eb4ba952ba7412ae18a1a4 Mon Sep 17 00:00:00 2001 From: Detlev Fischer Date: Thu, 16 May 2019 21:27:54 +0200 Subject: [PATCH 03/10] Another take on Pointer Gestures A re-write emphasizing the need for single-point activation, treating both directional swiping and dragging gestures as path-based gestures. Excluded are free-form drag-and-drop and free-form drawing (handwriting input, signature). I have rearranged examples and added a section with examples of alternatives for single-point activation to give implementers a clearer idea what the can do to meet this SC. --- understanding/21/pointer-gestures.html | 13 +++++++++---- 1 file changed, 9 insertions(+), 4 deletions(-) diff --git a/understanding/21/pointer-gestures.html b/understanding/21/pointer-gestures.html index 66a032606c..fbdf1015dd 100644 --- a/understanding/21/pointer-gestures.html +++ b/understanding/21/pointer-gestures.html @@ -10,10 +10,12 @@

Pointer Gestures
Understanding SC 2.5.1

Intent of this Success Criterion

-

The intent of this Success Criterion is to ensure that content can be operated using simple inputs on a wide range of pointing devices. This is important for users who cannot perform complex gestures in a precise manner; users may lack the precision or ability to carry out the gestures or they may use a pointing method that lacks the capability or accuracy to perform multipoint or path-based gestures.

+

The intent of this Success Criterion is to ensure that content that can be operated via path-based or multipoint gestures can also be operated using single-point activation with a pointer. This is important for users who cannot perform complex gestures in a precise manner, or not at all. Some pointer users may lack the precision or fine motor control to carry out multipoint or path-based gestures. Some users with strongly impaired motor control use a pointing method via an adapted input device such as a head pointer, an eye-gaze system, or speech-controlled mouse emulation. These users rely on single-point activation.

A path-based gesture involves an interaction where the user engages a pointer with the display (down event), carries out a directional movement in a pre-determined direction before disengaging the pointer (up event). The direction, speed, and also the delta between start and end point may each be evaluated to determine what function is triggered.

-

Examples include swiping (which relies on the direction of movement) and gestures which trace a prescribed path, as in the drawing of a specific shape. Such paths may be drawn with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

-

A user may find it difficult or impossible to accomplish these gestures if they have impaired fine motor control, or if they use a specialized or adapted input device such as a head pointer, eye-gaze system, or speech-controlled mouse emulation. Note that most dragging actions including drag and drop are not considered path-based gestures for the purposes of this Success Criterion. This is because once an object is selected, it can be dragged in a wayward manner to its destination (endpoint), and need not follow a prescribed path.

+

Examples of path-based gestures include the swiping and dragging of elements that move in a constrained manner along one axis, for example, content sliders, swipe-to-reveal controls, drawer-type menus, or control sliders where a thumb can be dragged along a groove to set a value. Path-based gestures may be executed with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

+

Alternatives for single-point activation include arrow buttons moving a content slider in a stepwise fashion; a menu button revealing a drawer-type menu; increment/decrement buttons or a numerical input field next to a control slider offering value input; or equivalent static controls made available after activating an element that implements a swipe-to-reveal gesture.

+ +

Note that unconstrained dragging in drag-and-drop interfaces, or the free-form drawing of shapes in handwriting recognition or in drawing a signature, are not considered path-based gestures for the purposes of this Success Criterion.

Examples of multipoint gestures include a two-finger pinch zoom, a split tap where one finger rests on the screen and a second finger taps, or a two- or three-finger tap or swipe. A user may find it difficult or impossible to accomplish these if they type and point with a single finger or stick, in addition to any of the causes listed above.

Authors must ensure that their content can be operated without such complex gestures. When they implement multipoint or path-based gestures, they must ensure that the functionality can also be operated via single-point activation. Examples of single-point activation on a touchscreen or touchpad include taps, double taps, and long presses. Examples for a mouse, trackpad, head-pointer, or similar device include single clicks, click-and-hold and double clicks.

This Success Criterion applies to author-created gestures, as opposed to gestures defined on the level of operating system or user agent. Examples for gestures provided at the operating system level would be swiping down to see system notifications, and gestures for built-in assistive technologies (AT) to focus or activate content, or to call up AT menus. Examples of user-agent-implemented gestures would be horizontal swiping implemented by browsers for navigating within the page history, or vertical swiping to scroll page content.

@@ -35,8 +37,11 @@

Examples

  • A web site includes a map view that supports both the pinch gesture to zoom into the map content and drag gestures to move the visible area. User interface controls offer the operation via [+] and [-] buttons to zoom in and out, and arrow buttons to pan stepwise in all directions.

  • A news site has a horizontal content slider with hidden news teasers that can be moved into the viewport via horizontal swiping. It also offers forward and backward arrow buttons for single-point activation to navigate to adjacent slider content.

  • +
  • A color choice slider is operated by horizontally dragging a thumb on a groove. Buttons on both sides of the slider increment and decrement the selected value and update the thumb position; long presses on the buttons offer continuous stepwise incrementation.

  • +
  • A mortgage lending site offers a slider for setting the amount of credit required. The slider is operated by horizontally dragging a thumb on a groove, and the thumb position can also be set by a single tap or click anywhere on the groove.

  • +
  • An email application uses a short swipe gesture to quickly archive an email, and a longer swipe to delete the email. The user can also activate the email item and then use buttons offering the same functionality.

  • A kanban widget with several vertical areas representing states in a defined process allows the user to right- or left-swipe elements to move them to an adjacent silo. The user can also accomplish this by selecting the element with a single tap or click, and then activating an arrow button to move the selected element.

  • -
  • A slider control restricts the movement to a strict left & right direction when operated by dragging the thumb. Buttons on both sides of the slider increment and decrement the selected value and update the thumb position.

  • +
From 5dd66a78f4912486923dca0978a4f43e44499371 Mon Sep 17 00:00:00 2001 From: Detlev Fischer Date: Thu, 16 May 2019 21:35:20 +0200 Subject: [PATCH 04/10] Removing redundant sentence on dnd exception --- understanding/21/pointer-gestures.html | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/understanding/21/pointer-gestures.html b/understanding/21/pointer-gestures.html index fbdf1015dd..f38584e018 100644 --- a/understanding/21/pointer-gestures.html +++ b/understanding/21/pointer-gestures.html @@ -14,14 +14,12 @@

Intent of this Success Criterion

A path-based gesture involves an interaction where the user engages a pointer with the display (down event), carries out a directional movement in a pre-determined direction before disengaging the pointer (up event). The direction, speed, and also the delta between start and end point may each be evaluated to determine what function is triggered.

Examples of path-based gestures include the swiping and dragging of elements that move in a constrained manner along one axis, for example, content sliders, swipe-to-reveal controls, drawer-type menus, or control sliders where a thumb can be dragged along a groove to set a value. Path-based gestures may be executed with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

Alternatives for single-point activation include arrow buttons moving a content slider in a stepwise fashion; a menu button revealing a drawer-type menu; increment/decrement buttons or a numerical input field next to a control slider offering value input; or equivalent static controls made available after activating an element that implements a swipe-to-reveal gesture.

- -

Note that unconstrained dragging in drag-and-drop interfaces, or the free-form drawing of shapes in handwriting recognition or in drawing a signature, are not considered path-based gestures for the purposes of this Success Criterion.

Examples of multipoint gestures include a two-finger pinch zoom, a split tap where one finger rests on the screen and a second finger taps, or a two- or three-finger tap or swipe. A user may find it difficult or impossible to accomplish these if they type and point with a single finger or stick, in addition to any of the causes listed above.

Authors must ensure that their content can be operated without such complex gestures. When they implement multipoint or path-based gestures, they must ensure that the functionality can also be operated via single-point activation. Examples of single-point activation on a touchscreen or touchpad include taps, double taps, and long presses. Examples for a mouse, trackpad, head-pointer, or similar device include single clicks, click-and-hold and double clicks.

This Success Criterion applies to author-created gestures, as opposed to gestures defined on the level of operating system or user agent. Examples for gestures provided at the operating system level would be swiping down to see system notifications, and gestures for built-in assistive technologies (AT) to focus or activate content, or to call up AT menus. Examples of user-agent-implemented gestures would be horizontal swiping implemented by browsers for navigating within the page history, or vertical swiping to scroll page content.

While some operating systems may provide ways to define "macros" to replace complex gestures, content authors cannot rely on such a capability because it is not pervasive on all touch-enabled platforms. Moreover, this may work for standard gestures that a user can predefine, but may not work for other author-defined gestures.

This Success Criterion does not require all functionality to be available through pointing devices, but that which is must be available to users who use the pointing device but cannot perform complex gestures. While content authors may provide keyboard commands or other non-pointer mechanisms that perform actions equivalent to complex gestures (see Success Criterion 2.1.1 Keyboard), this is not sufficient to conform to this Success Criterion. That is because some users rely entirely on pointing devices, or find simple pointer inputs much easier than alternatives. For example, a user relying on a head-pointer would find clicking a control to be much more convenient than activating an on-screen keyboard to emulate a keyboard shortcut, and a person who has difficulty memorizing a series of keys (or gestures) may find it much easier to simply click on a labeled control. Therefore, if one or more pointer-based mechanisms are supported, then their benefits should be afforded to users through simple, single-point actions alone.

-

An exception is made for functionality that is inherently and necessarily based on complex paths or multipoint gestures. For example, entering one's signature may be inherently path-based (although acknowledging something or confirming one's identity need not be).

+

An exception is made for functionality that is inherently and necessarily based on complex paths or multipoint gestures. For example, drawing letters for handwriting recognition or entering one's signature are inherently path-based (although writing or confirming one's identity need not be).

Gestures that involve dragging in any direction are not in scope for this SC; however, such gestures do require fine motor control. Authors are encouraged to provide non-dragging methods; for instance, a drag and drop operation could also be achieved by selecting an item (with a tap or keyboard interaction) and then selecting its destination as a second step.

From 255ac11777229169ae80f9d4da796efca6ce0b5a Mon Sep 17 00:00:00 2001 From: Detlev Fischer Date: Thu, 16 May 2019 21:37:46 +0200 Subject: [PATCH 05/10] editorial --- understanding/21/pointer-gestures.html | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/understanding/21/pointer-gestures.html b/understanding/21/pointer-gestures.html index f38584e018..2929e6bcbe 100644 --- a/understanding/21/pointer-gestures.html +++ b/understanding/21/pointer-gestures.html @@ -10,7 +10,7 @@

Pointer Gestures
Understanding SC 2.5.1

Intent of this Success Criterion

-

The intent of this Success Criterion is to ensure that content that can be operated via path-based or multipoint gestures can also be operated using single-point activation with a pointer. This is important for users who cannot perform complex gestures in a precise manner, or not at all. Some pointer users may lack the precision or fine motor control to carry out multipoint or path-based gestures. Some users with strongly impaired motor control use a pointing method via an adapted input device such as a head pointer, an eye-gaze system, or speech-controlled mouse emulation. These users rely on single-point activation.

+

The intent of this Success Criterion is to ensure that content that can be operated via path-based or multipoint pointer gestures can also be operated using single-point activation with a pointer. This is important for users who cannot perform complex gestures in a precise manner, or not at all. Some pointer users may lack the precision or fine motor control to carry out multipoint or path-based gestures. Some users with strongly impaired motor control use a pointing method via an adapted input device such as a head pointer, an eye-gaze system, or speech-controlled mouse emulation. These users rely on single-point activation.

A path-based gesture involves an interaction where the user engages a pointer with the display (down event), carries out a directional movement in a pre-determined direction before disengaging the pointer (up event). The direction, speed, and also the delta between start and end point may each be evaluated to determine what function is triggered.

Examples of path-based gestures include the swiping and dragging of elements that move in a constrained manner along one axis, for example, content sliders, swipe-to-reveal controls, drawer-type menus, or control sliders where a thumb can be dragged along a groove to set a value. Path-based gestures may be executed with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

Alternatives for single-point activation include arrow buttons moving a content slider in a stepwise fashion; a menu button revealing a drawer-type menu; increment/decrement buttons or a numerical input field next to a control slider offering value input; or equivalent static controls made available after activating an element that implements a swipe-to-reveal gesture.

From 6a487ab035037633c2f91d68a2568aed4dbeea0e Mon Sep 17 00:00:00 2001 From: Detlev Fischer Date: Thu, 16 May 2019 21:39:23 +0200 Subject: [PATCH 06/10] qualified 'pointer users' Making sure readers don't think "But there is a keyboard alternative". --- understanding/21/pointer-gestures.html | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/understanding/21/pointer-gestures.html b/understanding/21/pointer-gestures.html index 2929e6bcbe..254229cf49 100644 --- a/understanding/21/pointer-gestures.html +++ b/understanding/21/pointer-gestures.html @@ -10,7 +10,7 @@

Pointer Gestures
Understanding SC 2.5.1

Intent of this Success Criterion

-

The intent of this Success Criterion is to ensure that content that can be operated via path-based or multipoint pointer gestures can also be operated using single-point activation with a pointer. This is important for users who cannot perform complex gestures in a precise manner, or not at all. Some pointer users may lack the precision or fine motor control to carry out multipoint or path-based gestures. Some users with strongly impaired motor control use a pointing method via an adapted input device such as a head pointer, an eye-gaze system, or speech-controlled mouse emulation. These users rely on single-point activation.

+

The intent of this Success Criterion is to ensure that content that can be operated via path-based or multipoint pointer gestures can also be operated using single-point activation with a pointer. This is important for pointer users who cannot perform complex gestures in a precise manner, or not at all. Some pointer users may lack the precision or fine motor control to carry out multipoint or path-based gestures. Some users with strongly impaired motor control use a pointing method via an adapted input device such as a head pointer, an eye-gaze system, or speech-controlled mouse emulation. These users rely on single-point activation.

A path-based gesture involves an interaction where the user engages a pointer with the display (down event), carries out a directional movement in a pre-determined direction before disengaging the pointer (up event). The direction, speed, and also the delta between start and end point may each be evaluated to determine what function is triggered.

Examples of path-based gestures include the swiping and dragging of elements that move in a constrained manner along one axis, for example, content sliders, swipe-to-reveal controls, drawer-type menus, or control sliders where a thumb can be dragged along a groove to set a value. Path-based gestures may be executed with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

Alternatives for single-point activation include arrow buttons moving a content slider in a stepwise fashion; a menu button revealing a drawer-type menu; increment/decrement buttons or a numerical input field next to a control slider offering value input; or equivalent static controls made available after activating an element that implements a swipe-to-reveal gesture.

From da2e3e057750e4d11ef6dee587466618050bf641 Mon Sep 17 00:00:00 2001 From: Detlev Fischer Date: Fri, 17 May 2019 17:23:03 +0200 Subject: [PATCH 07/10] Added new tech and link --- understanding/21/pointer-gestures.html | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/understanding/21/pointer-gestures.html b/understanding/21/pointer-gestures.html index 254229cf49..2a86a38a3f 100644 --- a/understanding/21/pointer-gestures.html +++ b/understanding/21/pointer-gestures.html @@ -50,7 +50,7 @@

Resources

Techniques for SC 2.5.1 - Pointer Gestures

Sufficient Techniques

    -
  • GXXX: Do not rely on path-based gestures

  • +
  • GXXX: Not relying on path or multipoint gestures for operation

  • GXXX: Do not rely on multipoint gestures

  • GXXX: Provide controls that do not require complex gestures and perform the same function as complex gestures

  • GXXX: Single-point activation for spatial positioning and manipulation

  • From 9d46f7708305733826032824effdf35c42e77ab2 Mon Sep 17 00:00:00 2001 From: Detlev Fischer Date: Fri, 17 May 2019 17:28:57 +0200 Subject: [PATCH 08/10] Added and removed tech --- understanding/21/pointer-gestures.html | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/understanding/21/pointer-gestures.html b/understanding/21/pointer-gestures.html index 2a86a38a3f..4672b7499d 100644 --- a/understanding/21/pointer-gestures.html +++ b/understanding/21/pointer-gestures.html @@ -51,8 +51,7 @@

    Techniques for SC 2.5.1 - Pointer Gestures

    Sufficient Techniques

    From 8f059be1cabd96d30d1c35f36ee63ae17cec0f5d Mon Sep 17 00:00:00 2001 From: Detlev Fischer Date: Fri, 17 May 2019 17:56:12 +0200 Subject: [PATCH 09/10] Added link to new tech --- understanding/21/pointer-gestures.html | 1 + 1 file changed, 1 insertion(+) diff --git a/understanding/21/pointer-gestures.html b/understanding/21/pointer-gestures.html index 4672b7499d..b469205949 100644 --- a/understanding/21/pointer-gestures.html +++ b/understanding/21/pointer-gestures.html @@ -52,6 +52,7 @@

    Sufficient Techniques

    From c07abc938591c6ad6cfc7a8ae3e2b1c44ec5ea92 Mon Sep 17 00:00:00 2001 From: Alastair Campbell Date: Fri, 24 May 2019 18:18:18 +0100 Subject: [PATCH 10/10] Simplifying the path-based gestures description. To differentiate from the other options. --- understanding/21/pointer-gestures.html | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/understanding/21/pointer-gestures.html b/understanding/21/pointer-gestures.html index b469205949..8247f2050d 100644 --- a/understanding/21/pointer-gestures.html +++ b/understanding/21/pointer-gestures.html @@ -11,8 +11,8 @@

    Pointer Gestures
    Understanding SC 2.5.1

    Intent of this Success Criterion

    The intent of this Success Criterion is to ensure that content that can be operated via path-based or multipoint pointer gestures can also be operated using single-point activation with a pointer. This is important for pointer users who cannot perform complex gestures in a precise manner, or not at all. Some pointer users may lack the precision or fine motor control to carry out multipoint or path-based gestures. Some users with strongly impaired motor control use a pointing method via an adapted input device such as a head pointer, an eye-gaze system, or speech-controlled mouse emulation. These users rely on single-point activation.

    -

    A path-based gesture involves an interaction where the user engages a pointer with the display (down event), carries out a directional movement in a pre-determined direction before disengaging the pointer (up event). The direction, speed, and also the delta between start and end point may each be evaluated to determine what function is triggered.

    -

    Examples of path-based gestures include the swiping and dragging of elements that move in a constrained manner along one axis, for example, content sliders, swipe-to-reveal controls, drawer-type menus, or control sliders where a thumb can be dragged along a groove to set a value. Path-based gestures may be executed with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

    +

    A path-based gesture involves an interaction where the user engages a pointer (down event) and moves before disengaging the pointer (up event). The direction, speed, and also the delta between start and end point may each be evaluated to determine what function is triggered.

    +

Examples of path-based gestures include the swiping and dragging of elements such as content sliders, swipe-to-reveal controls, drawer-type menus, or control sliders where a thumb can be dragged along a groove to set a value. Path-based gestures may be executed with a finger or stylus on a touchscreen, graphics tablet, or trackpad, or with a mouse, joystick, or similar pointer device.

    Alternatives for single-point activation include arrow buttons moving a content slider in a stepwise fashion; a menu button revealing a drawer-type menu; increment/decrement buttons or a numerical input field next to a control slider offering value input; or equivalent static controls made available after activating an element that implements a swipe-to-reveal gesture.
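As a rough sketch of the first of these alternatives, previous and next buttons can move a swipeable content slider one step per activation. The element ids and the step width in the TypeScript below are assumptions for the example.

```typescript
// Rough sketch: arrow buttons move a swipeable content slider stepwise,
// giving a single-point alternative to the swipe gesture.
// Element ids and the step width are assumptions for this example.
const track = document.querySelector<HTMLElement>("#teaser-track");
const previous = document.querySelector<HTMLButtonElement>("#teaser-previous");
const next = document.querySelector<HTMLButtonElement>("#teaser-next");
const STEP = 320; // assumed width of one teaser card in CSS pixels

if (track && previous && next) {
  previous.addEventListener("click", () =>
    track.scrollBy({ left: -STEP, behavior: "smooth" })
  );
  next.addEventListener("click", () =>
    track.scrollBy({ left: STEP, behavior: "smooth" })
  );
}
```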

    Examples of multipoint gestures include a two-finger pinch zoom, a split tap where one finger rests on the screen and a second finger taps, or a two- or three-finger tap or swipe. A user may find it difficult or impossible to accomplish these if they type and point with a single finger or stick, in addition to any of the causes listed above.

    Authors must ensure that their content can be operated without such complex gestures. When they implement multipoint or path-based gestures, they must ensure that the functionality can also be operated via single-point activation. Examples of single-point activation on a touchscreen or touchpad include taps, double taps, and long presses. Examples for a mouse, trackpad, head-pointer, or similar device include single clicks, click-and-hold and double clicks.