How to Create Gestures in Altia Design

Creating gestures in Altia is indeed possible. Since implementation varies depending upon the gesture, the GUI design, and the performance of both the hardware and its touchscreen, we do not have generic documentation to provide.

However, a simple high-level overview conceptually describing how to create gestures in Altia Design is provided below, and there is also a video on this topic posted at this link:

For an example .dsn, click the link below to download a working motion-tracking and LeftDown/LeftUp click-coordinate-capture design (800x600 with per-pixel tracking):

https://altia.zendesk.com/hc/article_attachments/115000138450/gesture_setup_altiaSupport.zip

On any given touch screen, Altia Design can interpret a press as a "Left Down" Stimulus input event and a release as a "Left Up" Stimulus event. In the case of a fling, a bit of logic driven by these events does the following:

  • Record the x,y coordinates of the down event.
  • Record the x,y coordinates of the up event.
  • If the coordinate delta (the difference between the two coordinate locations) between the press event and the release event is large when the release event comes in, this should cause a scroll. (If you have not detected a release event yet but the coordinates are changing, that should cause a drag/scroll event.)
  • If you detect a release (up) event, you need to calculate some things based upon time. If the delta is large and the time is short, that should cause a fling event. A fling usually sets a "destination" scroll value much farther than the current scroll position and kicks off a time-based scroll that arrives at that location automatically. Conversely, if the delta is large but the time between press and release is long (or if the delta is simply too small for a fling), this should not fling, but perhaps just nudge the content a small amount from its current location so that it "feels/looks" smooth.
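The steps above can be sketched as a small classifier. This is only a conceptual illustration in Python, not generated Altia/DeepScreen code; the threshold names and values are assumptions that you would tune for your screen size, DPI, and desired feel:

```python
# Hypothetical thresholds -- real values depend on the hardware and design.
FLING_MIN_DISTANCE = 80    # pixels
FLING_MAX_DURATION = 0.30  # seconds
DRAG_MIN_DISTANCE = 10     # pixels; below this, treat as a tap

def classify_gesture(down, up):
    """Classify a press/release pair.

    down/up are (x, y, timestamp) tuples captured from the
    "Left Down" and "Left Up" Stimulus events.
    Returns "fling", "scroll", or "tap".
    """
    dx = up[0] - down[0]
    dy = up[1] - down[1]
    distance = (dx * dx + dy * dy) ** 0.5
    duration = up[2] - down[2]

    if distance < DRAG_MIN_DISTANCE:
        return "tap"                 # barely moved: a simple press
    if distance >= FLING_MIN_DISTANCE and duration <= FLING_MAX_DURATION:
        return "fling"               # large delta, short time
    return "scroll"                  # large delta but slow, or too small to fling
```

For example, a 300-pixel horizontal move completed in 0.15 seconds would classify as a fling, while the same move over 1.5 seconds would classify as a scroll.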

What you scroll and how you scroll (essentially, what moves and what it looks like) is fully customizable and created with objects in Altia Design. Embedded GUIs are not windows in an OS, so there might not ever be a scrolling region; if something does scroll/fling, it will need to do so in a custom way per the design of that interface. This is why the overall implementation depends heavily on the design intent and behavior of the screen in question.
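The time-based scroll a fling kicks off can be sketched as a per-tick update toward the destination value. In an Altia design this would typically be driven by a Timer object and logic; the Python below is only a sketch of the math, and the `smoothing` parameter is an assumed easing factor:

```python
def scroll_step(current, destination, smoothing=0.2):
    """One animation tick: move a fraction of the remaining distance.

    Repeatedly applied on a timer, this eases the scroll position
    toward `destination`, decelerating as it approaches (a simple
    "fling" feel).  Snaps to the destination once close enough.
    """
    remaining = destination - current
    if abs(remaining) < 0.5:
        return destination
    return current + remaining * smoothing
```

Calling this once per frame from a timer event moves the content quickly at first and progressively slower, which is one common way to make a fling "feel/look" smooth.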

However, the principles above apply to all of these scenarios with respect to calculating what has happened with user input and how best to act upon that input. If you would like further support on this topic and would like to see a specific example fleshed out for your needs, we can continue to support you on implementation (please submit a new request/ticket to ask for additional support).

As for multi-touch, things get even more differentiated at the driver level, since the handling depends upon the format/output of the touchscreen controller when it reports multi-touch events (assuming the touch screen in question supports multi-touch, of course). However, this highlights more flexibility in the Altia tool chain. For example, when generating for a target with an OS (say, Android), you can choose to stop at the high level, creating your Altia GUI executable as just an Android "app" and letting Android tell the GUI it has detected a swipe; or you can bypass the OS entirely and have your application talk to the hardware to do custom Stimulus/logic-based gestures as discussed before. This also highlights DeepScreen's ability to generate code for either high-level apps or deeply embedded, hardware-optimized GUIs that require no OS at all.

Of course, if you are doing Stimulus/logic-based events for multi-touch, the touchscreen controller's output format would have to be properly extracted and sent to Altia Design as some form of Left Down and Left Up event structure. As you can see, this question is best answered once a specific implementation is in play.
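One conceptual way to do that extraction is to diff successive controller reports and emit down/up/move events per contact. The report format here is entirely hypothetical (a dict of contact slot to coordinates); a real controller's report format, and the glue that forwards these events to Altia, would be specific to your hardware:

```python
def translate_touch_report(report, active_slots):
    """Turn one multi-touch controller report into down/up/move events.

    `report` is a hypothetical {slot_id: (x, y)} map of currently
    pressed contacts; the real format depends entirely on the
    touchscreen controller.  `active_slots` is the same map from the
    previous report.  Returns a list of (event, slot, x, y) tuples
    plus the updated slot map to carry into the next call.
    """
    events = []
    for slot, (x, y) in report.items():
        if slot not in active_slots:
            events.append(("LeftDown", slot, x, y))   # new contact
        elif active_slots[slot] != (x, y):
            events.append(("Move", slot, x, y))       # contact moved
    for slot, (x, y) in active_slots.items():
        if slot not in report:
            events.append(("LeftUp", slot, x, y))     # contact lifted
    return events, dict(report)
```

Each emitted tuple would then be mapped onto the appropriate Stimulus input for the corresponding finger, so the per-contact logic described earlier can run independently for each touch.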

In the end, this all happens in code, and nothing is left to magic. Since it is very flexible (and variable), it is really up to you and your project to determine the best way to tackle it. However, Altia has a great deal of expertise to support you through this decision and implementation process.
