Touch support is built into the RTL, and no explicit code is needed for your program to respond to Touch input. The RTL implementation is based on the Windows 8 API.
Note: Currently, for Windows 8 and 8.1, Microsoft has removed the OS-level functionality that lets Desktop apps automatically show the On-Screen Keyboard when a text-like control (Entry, Text box, Drop Combo, etc.) gains focus. We have added support in the "Enhanced Focus" template code to work around this limitation.
To enable your programs to automatically display the On-Screen Keyboard, go to Global Properties → Actions → App Settings and check the box for "Provide visual indicators on control with focus". Then choose the desired radio button for the On-Screen Keyboard.
The options are:
On - Your program will attempt to launch the On-Screen Keyboard on any device. If the On-Screen Keyboard isn't available, no error is displayed to the end user.
Off - Your program will not try to launch the On-Screen Keyboard.
Auto - Your program will ask the OS whether the device supports Touch input. If it does, the On-Screen Keyboard is set to activate on text-like controls; otherwise it is set to Off.
Enabling and disabling of the On-Screen Keyboard can also be set globally for the application at runtime by calling the SetOnScreenKeyboard() method of the EnhancedFocusManager class.
If you don't want the visual indicators to display but you do want the On-Screen Keyboard to display automatically, press the "Set visual indicators" button, turn off the "Change Color" checkbox, and on the "Box indicator" tab turn off the "Display Box" checkbox.
For flexibility and lower-level control over Touch input, you have the option to use interfaces to take control of the responses to all Touch events.
The files used to work directly with Touch input are located in your .\LIBSRC\Win folder: CWTOUCHDEF.INC, CWTOUCH.CLW, and CWTOUCH.INC.
CWTOUCHDEF.INC
In CWTOUCHDEF.INC you'll find the Clarion PointFlags ITEMIZE structure. PointFlags defines constants representing the possible flags for the pointerFlags field of the Windows API POINTER_INFO structure:
for more info: http://msdn.microsoft.com/EN-US/library/windows/desktop/hh969211%28v=vs.85%29.aspx
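As a sketch of how such a flags field is read, the snippet below decodes a pointerFlags value using the Windows SDK POINTER_FLAG_* constants. The numeric values come from the Windows SDK (pointerapi.h); the decode_pointer_flags helper is illustrative, not part of the RTL, and the Clarion EQUATE names in PointFlags may differ:

```python
# POINTER_FLAG_* values from the Windows SDK (pointerapi.h); the Clarion
# PointFlags ITEMIZE mirrors these constants.
POINTER_FLAG_NEW         = 0x00000001
POINTER_FLAG_INRANGE     = 0x00000002
POINTER_FLAG_INCONTACT   = 0x00000004
POINTER_FLAG_FIRSTBUTTON = 0x00000010
POINTER_FLAG_PRIMARY     = 0x00002000
POINTER_FLAG_DOWN        = 0x00010000
POINTER_FLAG_UPDATE      = 0x00020000
POINTER_FLAG_UP          = 0x00040000

_FLAG_NAMES = {
    POINTER_FLAG_NEW: "POINTER_FLAG_NEW",
    POINTER_FLAG_INRANGE: "POINTER_FLAG_INRANGE",
    POINTER_FLAG_INCONTACT: "POINTER_FLAG_INCONTACT",
    POINTER_FLAG_FIRSTBUTTON: "POINTER_FLAG_FIRSTBUTTON",
    POINTER_FLAG_PRIMARY: "POINTER_FLAG_PRIMARY",
    POINTER_FLAG_DOWN: "POINTER_FLAG_DOWN",
    POINTER_FLAG_UPDATE: "POINTER_FLAG_UPDATE",
    POINTER_FLAG_UP: "POINTER_FLAG_UP",
}

def decode_pointer_flags(flags):
    """Return the names of the flags set in a pointerFlags value."""
    return [name for bit, name in _FLAG_NAMES.items() if flags & bit]
```

For example, a primary touch making first contact would carry a combination such as POINTER_FLAG_DOWN | POINTER_FLAG_INRANGE | POINTER_FLAG_INCONTACT | POINTER_FLAG_PRIMARY.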
Constants defined in the InputFlags ITEMIZE structure are a combination of the MK_* constants (WINUSER.H) and the flags used in the dwFlags field of the Windows API GESTUREINFO structure (shifted right by 8 bits).
for more info: http://msdn.microsoft.com/en-us/library/windows/desktop/dd353232%28v=vs.85%29.aspx
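The combination can be pictured as a packing scheme: the MK_* bits stay in the low byte and the GESTUREINFO flags occupy the byte above them, consistent with the 8-bit shift mentioned above. The sketch below assumes that layout; pack_input_flags and unpack_input_flags are illustrative helpers, and the actual InputFlags values are defined in CWTOUCHDEF.INC:

```python
# MK_* values from WINUSER.H and GF_* values from the GESTUREINFO dwFlags
# field; both sets of numeric values are from the Windows SDK.
MK_LBUTTON = 0x0001
MK_RBUTTON = 0x0002
MK_SHIFT   = 0x0004
MK_CONTROL = 0x0008
MK_MBUTTON = 0x0010

GF_BEGIN   = 0x0001
GF_INERTIA = 0x0002
GF_END     = 0x0004

def pack_input_flags(mk_flags, gesture_flags):
    """Combine MK_* bits with gesture flags kept one byte apart (assumed layout)."""
    return mk_flags | (gesture_flags << 8)

def unpack_input_flags(flags):
    """Split a combined value back into (mk_flags, gesture_flags)."""
    return flags & 0xFF, flags >> 8
```

With this layout, SHIFT held while a gesture begins packs to 0x0104 and splits back into its two components losslessly.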
There are also EQUATEs defined that correspond to flags set by the Clarion runtime. You'll find EQUATEs for PointerTypes, Touch Actions, and Touch Gestures.
CWTOUCH.INC
All the structures in this file can potentially be linked in lists:
- multiple touch points can be active simultaneously
- some gestures can be decoded into a combination of "standard" gestures, for example a two-finger pan gesture plus a zoom gesture.
All the structures have a LONG Cookie as their first field and a virtual Next method. The Next method uses the Cookie field to find the next linked object, if there is one. Your program should not modify the Cookie field.
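The traversal pattern can be sketched as follows. LinkedTouchData and walk are hypothetical stand-ins for the RTL structures, with an ordinary object reference playing the opaque Cookie's role:

```python
class LinkedTouchData:
    """Hypothetical stand-in for the RTL structures: the RTL keeps an opaque
    LONG Cookie as the first field and resolves the next object through a
    virtual Next method."""
    def __init__(self, ident, next_obj=None):
        self.id = ident
        self._cookie = next_obj  # opaque: your program should not modify it

    def Next(self):
        """Return the next linked object, or None when this is the last one."""
        return self._cookie

def walk(first):
    """Visit every object in the chain, e.g. all simultaneous touch points,
    and collect their identifiers in order."""
    items = []
    obj = first
    while obj is not None:
        items.append(obj.id)
        obj = obj.Next()
    return items
```

The same loop shape works for every structure in CWTOUCH.INC: call Next until it returns nothing.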
The InputPoint and InputData structures contain information about gesture points and other gesture information. The TouchPoint and TouchData structures contain information about raw touch points and other touch information.
The InputPoint and TouchPoint structures contain fields named pt and ptLocal, both of which hold the coordinates of the touch point: pt contains the position of the point in screen coordinates, while ptLocal holds the position in WINDOW or control coordinates (depending upon the Ctl field of the InputData or TouchData structure).
InputData structure
W: is a reference to the WINDOW receiving touch input
Ctl: contains the feq of the control receiving the input event
ia: input action; this value is a constant from the InputAction ITEMIZE structure (CWTOUCHDEF.INC)
ptAction: “action” point for current input event, for example, mouse position
pt1 and pt2: additional points involved in the current input event; dependent on the event's input action
buttons: (SHIFT, CTRL, mouse or pen buttons) that were pressed at the time the event was generated
param: flag parameters of the event, dependent upon the input action; any gesture can have the following flags:
INFLAG_BEGIN - gesture begins
INFLAG_END - gesture ends
The other fields in the InputData Class contain additional gesture data.
Some of the most common cases:
If ia = GESTURE_PAN:
ptAction: contains coordinates of the finger, or of the center point between two fingers
pt1: NULL in the case of a two-finger pan, or the same value as ptAction for a one-finger pan
pt2: base point of the gesture: the action point at the moment the gesture begins (when param has the INFLAG_BEGIN flag set)
param: can have the following flags:
INFLAG_INERTIA - the pan has inertia
INFLAG_VERTICAL - the pan has a vertical component of finger movement
INFLAG_HORIZONTAL - the pan has a horizontal component of finger movement
INFLAG_TWOFINGERS - two-finger pan
distance: distance between fingers in case of a two-finger pan
scroll: x- and y-offsets of the action point since the previous input event for the same gesture
speed: current inertia speed along the x- and y-axes in the case of a pan with inertia (see the description of the ullArguments field of the GESTUREINFO structure)
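A pan handler typically accumulates the scroll offsets while honoring the component flags. The sketch below uses illustrative INFLAG_* bit values and a hypothetical apply_pan helper; the real EQUATEs live in CWTOUCHDEF.INC:

```python
# INFLAG_* bit values here are illustrative placeholders, not the real
# EQUATEs from CWTOUCHDEF.INC.
INFLAG_BEGIN      = 0x01
INFLAG_END        = 0x02
INFLAG_INERTIA    = 0x04
INFLAG_VERTICAL   = 0x08
INFLAG_HORIZONTAL = 0x10
INFLAG_TWOFINGERS = 0x20

def apply_pan(origin, param, scroll):
    """Apply one pan event's scroll offsets (dx, dy) to a scroll origin,
    honoring the vertical/horizontal component flags in param."""
    x, y = origin
    dx, dy = scroll
    if param & INFLAG_HORIZONTAL:
        x += dx
    if param & INFLAG_VERTICAL:
        y += dy
    return x, y
```

Calling this once per pan event keeps a scrolled view in step with the finger; when INFLAG_INERTIA is set, the speed field could drive further offsets after the finger lifts.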
for GESTURE_ZOOMIN or GESTURE_ZOOMOUT:
ptAction: center point of the zoom
ratio: zoom ratio (>= 1)
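One way to picture the ratio >= 1 convention: compare the finger spread now with the spread when the gesture began, and invert the quotient for a pinch. This is only an illustration of the convention with a hypothetical helper, not the runtime's actual computation:

```python
def classify_zoom(distance_begin, distance_now):
    """Given the finger spread at gesture start and now, name the gesture
    and return a ratio >= 1, matching the convention described above.
    Illustrative only; not the RTL's actual computation."""
    if distance_now >= distance_begin:
        # fingers moved apart: zoom in, ratio grows with the spread
        return "GESTURE_ZOOMIN", distance_now / distance_begin
    # fingers moved together: zoom out, invert so the ratio stays >= 1
    return "GESTURE_ZOOMOUT", distance_begin / distance_now
```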
for GESTURE_ROTATE:
ptAction: center of rotation
angle: cumulative angle since the rotation started
for GESTURE_TAP:
ptAction: center between two fingers
pt1: coordinates of first finger
pt2: coordinates of second finger
param: has the INFLAG_TWOFINGERS flag
distance: distance between fingers
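The geometric relationship among ptAction, pt1, pt2, and distance in a two-finger tap can be checked with a little arithmetic; two_finger_tap_fields is a hypothetical helper, not an RTL function:

```python
import math

def two_finger_tap_fields(pt1, pt2):
    """Derive ptAction (the midpoint between the fingers) and the distance
    between them from the two finger positions of a two-finger tap."""
    cx = (pt1[0] + pt2[0]) / 2
    cy = (pt1[1] + pt2[1]) / 2
    distance = math.hypot(pt2[0] - pt1[0], pt2[1] - pt1[1])
    return (cx, cy), distance
```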
for GESTURE_PRESSANDTAP:
ptAction: coordinates of the finger that came down first
pt1: same as ptAction
pt2: coordinates of the second finger
delta: the delta between the two fingers along the X- and Y-axes
More information can be found on the following pages:
http://msdn.microsoft.com/EN-US/library/windows/desktop/dd353242%28v=vs.85%29.aspx
http://msdn.microsoft.com/en-us/library/windows/desktop/dd353232%28v=vs.85%29.aspx
TouchPoint structure
ID: system identifier of touch point
PTType: the type of touch device
ia: input action
target: handle to the window/control that is the target of the touch event
PTFlags: point flags - combination of PointFlags flags
buttons: keyboard and mouse/pen buttons pressed at the time of the event
param: the INFLAG_HORIZONTAL or INFLAG_VERTICAL flags for mouse wheel events, or a combination of PEN_FLAGS flags for pen events
downtime: the time (in milliseconds since system start) at which this point of contact began
lastmovetime: the time of the last detected movement of this point
updatetime: the time of the last update of this point's position or state
more info: http://msdn.microsoft.com/EN-US/library/windows/desktop/hh969208%28v=vs.85%29.aspx
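The three timestamps make simple derived checks possible, for example a long-press test. The helper names, the 800 ms threshold, and the reading that "no movement" means lastmovetime still equals downtime are all illustrative assumptions, not RTL behavior:

```python
def press_duration_ms(downtime, updatetime):
    """How long a contact has been held, from the TouchPoint timing fields
    (both values are milliseconds since system start)."""
    return updatetime - downtime

def is_stationary_long_press(downtime, lastmovetime, updatetime,
                             threshold_ms=800):
    """True when a contact has been held past the threshold without any
    detected movement. The threshold and the 'lastmovetime == downtime'
    reading are assumptions, not RTL constants."""
    return (updatetime - downtime) >= threshold_ms and lastmovetime == downtime
```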
Other fields are taken from the POINTER_INFO, POINTER_TOUCH_INFO, and POINTER_PEN_INFO structures, depending upon the type of device being used.
more info:
http://msdn.microsoft.com/EN-US/library/windows/desktop/hh454907%28v=vs.85%29.aspx
http://msdn.microsoft.com/en-us/library/windows/desktop/hh454910%28v=vs.85%29.aspx
http://msdn.microsoft.com/en-us/library/windows/desktop/hh454909%28v=vs.85%29.aspx