I have a Flex app that supports full-screen mode. On some machines it works fine: pressing "Esc" exits full-screen mode, and I can use "Ctrl-C" to copy text.
But on other machines, pressing the "Ctrl" key exits full-screen mode. After that, mouse actions become sticky: when I move a popup by dragging its title bar, it keeps moving even after I release the mouse button.
The problem sometimes appears on a machine that used to work fine, so I suspect it could be a Flex defect.
Any suggestions?
The following is a dummy implementation of our web application:
https://roleapplication.herokuapp.com/index.html
The appArea element has role="application" because it contains highly complex widgets along the lines of MS Paint, an editor, or MS Office.
The navigator contains standard web widgets such as dropdowns and buttons.
The HTML is similar to the following:
<body>
  <div class="appArea" role="application">
    <!-- Complex widgets -->
  </div>
  <div class="toolbar">
    <!-- Buttons, dropdowns -->
  </div>
</body>
Keyboard functionality for appArea is handled by its own code; for the toolbar we rely on the screen reader's keyboard handling, as those widgets behave like standard controls in a web browser.
Issue: when the user presses Escape in the navigator area, we blur the navigator, so focus falls back to the body by default.
Now that focus is on the body, the arrow keys move focus to the toolbar, so the user is never able to get into appArea. If focus is already inside appArea, it works fine.
Expectation: when focus is on the body, pressing the down arrow should move focus inside appArea, and appArea should then receive the key instead of the screen reader.
Check the down-arrow behavior when the page is loaded both with and without a screen reader.
Keyboard notes
Press F6 to move from widget 1 to widget 2 to the navigator.
You can use the arrow/Tab keys to navigate within the widgets.
Move to the navigator using F6, press Tab to reach any button, and then press Escape. Focus is now on the body (check using document.activeElement).
Without a screen reader, our widgets capture keys on the body and process them even though they don't have focus.
However, with a screen reader, when the body has focus and the user presses the down arrow, the screen reader consumes the key and moves focus to the navigator instead of the application area that holds the widgets, so the user cannot reach appArea using the arrow keys or any other keys the screen reader consumes.
Notes:
If we give role="application" to the complete application, the navigator's default arrow-key handling stops working, which is not desired.
Removing role="application" is not possible, as appArea is quite complex, with hundreds of widgets that all have their own keyboard handling.
There are three ways to interact with role="application":
1. Hit Enter on the application element, exit edit mode (or forms mode), and use the application as if it were another web page. You can put other elements there and the screen reader will move through those elements in browse mode.
2. Hit Enter on the application, which pops the screen reader into edit mode, where all keys are passed to the edit widget inside the application and you handle everything within your application, probably on a keydown event.
3. Control the tabindex as the screen reader user presses keys, using a roving tabindex.
You currently have a mix of 1 and 3, which is really confusing. If you removed the application element, it would still work just fine. It sounds as if you want 2, though. Option 2 is highly discouraged unless you have a screen reader user constantly testing the UX or building your app; it is mostly for games and is considered the "canvas" element for screen readers.
You implement option 2 as follows:
<div role="application">
  <input type="button" autofocus value="Click me" />
  <p aria-live="polite" id="spk"></p>
</div>
The spk element is used to send messages to the screen reader, which you need to do in this Window, Icon, Menu, Message (WIMM) style of interface. Remember that in this mode you need to program everything yourself, and users get upset if their expectations are not met.
You said you are making a word processor. This last option (number 2) is NOT meant for building a word processor. As a screen reader user, I have expectations and workflows for word processors, and you can't reproduce that functionality by programming it manually in JavaScript.
Instead, use the existing edit fields HTML provides for this reason, such as:
This text editor example
Please let me know if there is some reason why you would not want to use the above widget.
You could get away with using 3 along with normal widgets, but it is better to do what Google Drive does: let users enter edit mode when the page loads, or press a key, such as Escape, to enter the roving-tabindex application area (which does not need to be in an application element, although it can be).
Edit: After reading your question again, it sounds as if you can't figure out how to enter the application element. You arrow to where the screen reader says "application" and hit Enter. To get out, you either Tab to the next focusable element outside the application or press the special key command that exits the application. In NVDA, this key command is Ctrl+NVDA+Space. In your application, the application element is the first element.
role='application' should only be used on rare occasions. As you noted, it causes all keyboard events to skip the screen reader and go directly to your app, which means the screen reader's virtual cursor does not work. Typically, a screen reader will automatically go into "application" mode (often called "forms mode") for certain types of widgets, such as an input field. If you are using widget roles, you will get this "forms mode" for free.
When you say "arrow keys" are not working, are you talking about up/down arrows or left/right arrows? They have different behaviors for a screen reader.
I'm building the UI for an application using Qt and QML on Ubuntu Linux. I have a viewer window with a canvas element that is supposed to be fullscreen by default. On opening the application this works fine (i.e., the Ubuntu sidebar and top taskbar are hidden). However, once I minimize the application and then restore it to fullscreen with viewer->setFullScreen();, the Ubuntu sidebar and top taskbar are still visible, and because of this there is an offset when drawing on the canvas.
Any help would be appreciated.
According to this topic on Ask Ubuntu, your problem really does look like a Unity bug (or feature). But according to a somewhat related bug on Launchpad, it seems that you can get the desired behavior as follows:
Turn "Always On Top" on via right-clicking the titlebar of your window, before making it go fullscreen.
This will prevent the Unity panel from rendering on top of this fullscreen-window, when using the other screen.
In Qt you can set Qt::WindowStaysOnTopHint on your window/widget via QWidget::setWindowFlags().
Pay additional attention to the notes in the official documentation:
This function calls setParent() when changing the flags for a window, causing the widget to be hidden. You must call show() to make the widget visible again.
About Qt::WindowStaysOnTopHint -- Informs the window system that the window should stay on top of all other windows. Note that on some window managers on X11 you also have to pass Qt::X11BypassWindowManagerHint for this flag to work correctly.
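For illustration, here is a minimal PyQt5 sketch of that sequence (the question's application is C++/QML, so treat this only as an assumed demonstration of the call order: add the flag, then show the window again):
import sys
from PyQt5.QtCore import Qt
from PyQt5.QtWidgets import QApplication, QWidget

app = QApplication(sys.argv)
viewer = QWidget()
# Add the stay-on-top hint to the existing flags; on some X11 window managers
# Qt.X11BypassWindowManagerHint may be needed as well (see the note above).
viewer.setWindowFlags(viewer.windowFlags() | Qt.WindowStaysOnTopHint)
# Changing the window flags hides the widget, so it has to be shown again.
viewer.showFullScreen()
sys.exit(app.exec_())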
Hope this helps.
I have a Qt application with a borderless window. I create the application window with CreateWindow() and then have a QWidget that uses the HWND.
Everything works fine except when the application is maximized and the Windows taskbar is set to auto-hide. In that case, if I click on the application's taskbar icon, the program is not minimized. Even stranger, this only happens when the application is on my primary monitor. If it's on another monitor, everything works fine, and if the taskbar is not set to auto-hide, everything works fine on any monitor.
When the taskbar is set to auto-hide, it wasn't even appearing on my primary screen, so I show it with ShowWindow(hTaskbar, SW_SHOW). Once it appears, clicking the icons of other applications works fine, so there must be something wrong with mine, but I'm not sure where to start.
You need to handle WM_SYSCOMMAND and respond to SC_MINIMIZE.
WM_SYSCOMMAND on MSDN
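As a rough illustration of that idea in a Qt context, here is a hedged PyQt5 sketch (the original window is created in C++ with its own WndProc, so treat the class and names here as assumptions): it intercepts WM_SYSCOMMAND in nativeEvent() and minimizes on SC_MINIMIZE.
import ctypes.wintypes
import sys
from PyQt5.QtWidgets import QApplication, QWidget

WM_SYSCOMMAND = 0x0112
SC_MINIMIZE = 0xF020

class BorderlessWindow(QWidget):
    def nativeEvent(self, eventType, message):
        if eventType == b"windows_generic_MSG":
            msg = ctypes.wintypes.MSG.from_address(int(message))
            # The low four bits of wParam are used internally by the system,
            # so mask them off before comparing against SC_* values.
            if msg.message == WM_SYSCOMMAND and (msg.wParam & 0xFFF0) == SC_MINIMIZE:
                self.showMinimized()
                return True, 0  # handled; stop further processing
        return super().nativeEvent(eventType, message)

app = QApplication(sys.argv)
window = BorderlessWindow()
window.show()
sys.exit(app.exec_())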
I need to have my app running in fullscreen mode.
For that I used QDesktopWidget. When the Windows taskbar is locked, it works fine.
Problems start when the taskbar is in auto-hide mode.
I cannot find any way to receive a notification (signal) that the taskbar size on the desktop has changed, so I cannot react and resize my widget.
Is there any way to be notified when the Windows taskbar changes from hidden to visible, or the other way around?
I would be glad of any hints.
Marek
I don't know why you use QDesktopWidget for running in fullscreen mode! You can simply set your MainWindow's state to fullscreen with:
this->setWindowState(Qt::WindowFullScreen);
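For anyone working from Python, a minimal runnable PyQt5 equivalent of that call might look like the following (the window name is just a placeholder; the original snippet is C++):
import sys
from PyQt5.QtCore import Qt
from PyQt5.QtWidgets import QApplication, QMainWindow

app = QApplication(sys.argv)
window = QMainWindow()
window.setWindowState(Qt.WindowFullScreen)  # roughly equivalent to window.showFullScreen()
window.show()
sys.exit(app.exec_())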
I want to hide the system cursor for 10 seconds for a particular reason, but I found that
cursor.setShape(Qt.BlankCursor)
can only hide the mouse cursor over my QWidgets, not system-wide; i.e., when the cursor is hovering over my QWidgets it is invisible, but otherwise it is visible. Is there any way to hide the system cursor system-wide?
The Win32 system call ShowCursor works per-window only. You can access it from either ctypes or pywin32's win32api. But apparently the cursor drawing is controlled by the display driver and can only be affected by specific windows; you can't force another window to hide its cursor. Two options:
Use ShowCursor(False) on your window and, for the desktop background, create a root-window application that you spawn from your GUI app to hide the cursor there; your app would make it exit after 10 seconds, but again, if the user moves the mouse over other apps' windows they will see the cursor (see the sketch after this list).
Make your application a root-window application; then, while it is in view, ShowCursor(False) will make the cursor disappear everywhere on screen except the system toolbar (which is a good thing).
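A minimal PyQt5 sketch of the first option, assuming Windows and the standard user32 ShowCursor call (the 10-second duration comes from the question; the rest is illustrative):
import ctypes
import sys
from PyQt5.QtCore import QTimer
from PyQt5.QtWidgets import QApplication, QWidget

def hide_cursor_for(ms):
    # ShowCursor manages a display counter, so every ShowCursor(False)
    # must eventually be balanced by a ShowCursor(True).
    ctypes.windll.user32.ShowCursor(False)
    QTimer.singleShot(ms, lambda: ctypes.windll.user32.ShowCursor(True))

app = QApplication(sys.argv)
window = QWidget()
window.show()
hide_cursor_for(10_000)  # the cursor still reappears over other applications' windows
sys.exit(app.exec_())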
I don't think it is a good idea anyway; what if your app crashes while the mouse is hidden? Then the user can't use their desktop easily. That is a good reason why this is not allowed.
The best approach is to think of a different solution to whatever problem led you to try hiding the cursor.