It’s Time to Stop Using Three-Button Navigation on Android: Stop Tapping and Start Swiping
2 years ago
I resisted this for a while, but finally gave in and tried it out a week ago. I opted for the three bottom swipes, so everything is still in the same position (to make the transition a bit easier), but I may yet progress to the side swipe.
Yes, the muscle memory kicks in quite quickly, and I'm sticking with it. You gain a bit of extra screen space, and the theory is that it is quicker to swipe than to aim for and press a button. Having recently switched back from an iPhone, it also feels familiar, as iOS uses a swipe up to return to the home screen and a swipe up and sideways to see recent apps.
See It’s Time to Stop Using Three-Button Navigation on Android
#technology #Android #gestures #navigation

EyeMU Interactions: Experimental Gaze Tracking and Gesture Control for Mobile Devices
2 years ago
This repository (linked from this post) houses the code for the paper EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices. The authors explored combining gaze tracking with motion gestures on the phone to enable enhanced single-handed interaction with mobile devices. There are links to their open-access paper for further details, as well as their demo video.
I actually tested this with a desktop webcam, and even that works for gaze tracking (to some extent).
The biggest problem, for me anyway, is remembering which gestures to use, so an app to practise with (like a game) can really help build that muscle memory. Eye tracking does feel quite natural, and I like the way they use the gaze tracking to eliminate false positives from the motion gestures.
At the link below are some demo sites you can try out from your mobile device (hint: go for the Playground link to avoid a very lengthy calibration); no software needs to be installed.
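That false-positive filtering is the interesting part: the phone moves around constantly, so a motion gesture is only trusted when the user is also looking at the screen. Here is a minimal sketch of that gating idea; all names and gesture labels are illustrative assumptions, not the repository's real API (EyeMU's actual pipeline uses a learned gaze model).

```python
# Sketch: accept an IMU motion gesture only while gaze is on the screen.
# Requiring gaze attention filters out incidental phone movement, which
# is how combining the two signals reduces gesture false positives.
from typing import Optional

# Hypothetical gesture vocabulary, loosely modelled on flick/pull motions
KNOWN_GESTURES = {"flick_left", "flick_right", "pull_toward", "push_away"}

def detect_gesture(imu_motion: str, gaze_on_screen: bool) -> Optional[str]:
    """Return the recognised gesture, or None if it should be ignored."""
    if not gaze_on_screen:
        return None  # phone moved, but the user wasn't looking: ignore
    if imu_motion not in KNOWN_GESTURES:
        return None  # unrecognised motion
    return imu_motion

# Incidental motion while not looking at the screen is rejected:
assert detect_gesture("flick_left", gaze_on_screen=False) is None
# The same motion while looking at the screen is accepted:
assert detect_gesture("flick_left", gaze_on_screen=True) == "flick_left"
```

The design point is simply that gaze acts as an attention gate on an otherwise noisy motion channel, rather than as a pointing device on its own.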
See GitHub - FIGLAB/EyeMU: Gaze + IMU Gestures on Mobile Devices
#technology #opensource #assistivetechnology #gestures
