How do I click a button when its frame is highlighted after SHIFT+TAB on Mac?
Is this not supported in the OS X app as part of the accessibility/screen reader features in MS Teams, and is there a description of how to navigate within a view?
I can't figure out how to click, for example, Admit all or Mute all.
FYI - I'm writing AppleScript to press the required key sequences accordingly.
EDIT: FYI - the web client supports selecting the button, but I have to use the desktop app for other features.
I tested the desktop version of Teams (1.4.00.26376) on Windows 10; like the OP, I found I couldn't access the 'Mute All' button when focus was on the 'Attendees/In this meeting' accordion panel. However, when I pressed the Applications key (or Shift+F10), a context menu appeared with the 'Mute All' option, and I could activate it with the keyboard.
Related
How do I switch to a different task, or view previously opened windows, in eWAM without using the mouse? I have tried ALT+TAB but it is not working.
Due to the "modal" nature of Wynsure, it is often not advisable to switch between screens while in a multi-step process or sub-process. However, when relevant, you can use Ctrl+F6 to cycle between open windows as an alternative to selecting them through the "View" menu with the mouse. Here is a list of the most popular Wynsure shortcuts:
Ctrl+F2 to insert the current date in any date field (must be active)
Ctrl+F4 to close a window
Ctrl+F6 to cycle between open windows
Ctrl+F10 to toggle a window between "maximized" in the workbench and "restored" as a separate window outside the application
We're using JAWS to test accessibility in our web application on IE11. One of our controls requires a CTRL + click to bring up a context menu. Is there a way to do this in JAWS with keyboard commands?
Thank you
It is basically not good practice to attach context menus to CTRL+left click in an accessible application, unless you provide explicit messaging about it. You should consider intercepting the standard context menu keys (Applications/SHIFT+F10) instead.
However, there is indeed a key combination for that in JAWS: Ctrl+NumPad Slash, since NumPad Slash simulates a left mouse button click at the JAWS cursor position.
But please note that the user navigates the webpage with the virtual PC cursor, not the JAWS cursor. So to carry out your command, the user first has to route the JAWS cursor to the virtual PC cursor (Insert+NumPad Minus) and then execute the CTRL+left click. This is an extremely uncomfortable solution, since it is not at all obvious that in this particular place the user has to route JAWS to PC and then CTRL+click.
Please think about a better approach for JAWS users.
While debugging in Qt Creator (ver 3.4.2), if I hit the escape key (which I tend to do often to declutter my workspace), all of the debugger views, including the debugger toolbar, become hidden (as expected), but later I can't get them back. If I go to Qt Creator's main menu -> Window, the Views submenu is disabled.
Here's an example of a basic window before I press the escape key. Notice I have all of the debugging views showing (e.g. Breakpoints, Stack, Locals and Expressions, etc.).
Here's an example of my window after I've pressed the escape key. Notice how all of the debugging windows are hidden (as expected). My question is, how do I now get the windows back? You can see how the "Views" submenu under the "Window" menu is disabled.
Is there some sort of "Show Debugger Toolbar" keyboard shortcut? Or is there another menu somewhere to get this back? Any help would be much appreciated.
Under the Window menu, enable Show Mode Selector. This will show a strip down the left of your window where you should see a Debug tab you can click on to put Qt Creator back in Debug mode.
I am working on a project which uses the Facebook Graph API to log in. I have the requirement of only using a virtual keyboard (no hardware keyboard will be present). I have looked everywhere but can't find a solution for adding a virtual QWERTY keyboard to the popup.
I can put the keyboard into a popup, or I could add the QWERTY keyboard to the screen with the addChild() method, but I still have one problem: the virtual keyboard does not give focus to the textInputs of the popup, and when I press a key, everything "blows up".
Does anyone know how I could solve the focus problem?
I mean... when I press a virtual key, I call a Java function which simulates a physical keyboard, but I lose focus on the Facebook text input and the letter does not end up in the textInput... and I don't know how to recover the focus...
Thanks in advance for the help!
We had the same problem with a desktop app written in C#; I can only answer for a Windows-based application. Assuming you are working on a desktop app and showing the login in a web browser control, you can use the SendInput API to direct keyboard-like input to a field in the browser. We had our own custom keyboard; I don't think you will be able to use the built-in on-screen keyboard MS provides.
We had a Windows form that hosted a web browser control and the custom keyboard control. The user touches the field they want to fill in and types their input using the on-screen keyboard; the keyboard uses SendInput to send the appropriate character for the key that was touched to the web browser control (a rough sketch of this is at the end of this answer). Other problems to look out for:
the Facebook login form takes a lot of space; having both the keyboard and the login visible at the same time is difficult
sending non-ASCII characters; see this for help (SendInput sequence to create unicode character fails)
the user will have to touch to select the input field
there are other links on the FB login page you may want to restrict (like creating an account)
an on-screen keyboard where touching a key doesn't steal focus from the browser field
These can all be solved but they are not trivial.
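To make the SendInput part more concrete, here is a minimal sketch of the kind of helper an on-screen keyboard button could call. It illustrates the technique only and is not our original code: the names VirtualKeySender and SendUnicodeChar are made up, and it assumes the keyboard buttons are configured so they never take focus away from the field inside the browser control.

    using System;
    using System.Runtime.InteropServices;

    // Rough sketch: a SendInput wrapper the on-screen keyboard can call.
    // Each character is synthesized as a Unicode key-down/key-up pair and is delivered to
    // whichever control currently owns keyboard focus, e.g. the login field in the browser control.
    internal static class VirtualKeySender
    {
        private const uint INPUT_KEYBOARD = 1;
        private const uint KEYEVENTF_KEYUP = 0x0002;
        private const uint KEYEVENTF_UNICODE = 0x0004;

        [StructLayout(LayoutKind.Sequential)]
        private struct MOUSEINPUT
        {
            public int dx, dy;
            public uint mouseData, dwFlags, time;
            public IntPtr dwExtraInfo;
        }

        [StructLayout(LayoutKind.Sequential)]
        private struct KEYBDINPUT
        {
            public ushort wVk, wScan;
            public uint dwFlags, time;
            public IntPtr dwExtraInfo;
        }

        // The native INPUT structure contains a union; including MOUSEINPUT keeps the size correct.
        [StructLayout(LayoutKind.Explicit)]
        private struct InputUnion
        {
            [FieldOffset(0)] public MOUSEINPUT mi;
            [FieldOffset(0)] public KEYBDINPUT ki;
        }

        [StructLayout(LayoutKind.Sequential)]
        private struct INPUT
        {
            public uint type;
            public InputUnion u;
        }

        [DllImport("user32.dll", SetLastError = true)]
        private static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);

        // Called from a key's click/touch handler with the character that key represents.
        public static void SendUnicodeChar(char c)
        {
            var down = new INPUT { type = INPUT_KEYBOARD };
            down.u.ki = new KEYBDINPUT { wScan = c, dwFlags = KEYEVENTF_UNICODE };

            var up = new INPUT { type = INPUT_KEYBOARD };
            up.u.ki = new KEYBDINPUT { wScan = c, dwFlags = KEYEVENTF_UNICODE | KEYEVENTF_KEYUP };

            SendInput(2, new[] { down, up }, Marshal.SizeOf(typeof(INPUT)));
        }
    }

Each key's handler then just calls VirtualKeySender.SendUnicodeChar with its character; because the input is synthesized at the system level, it lands in the focused login field the same way a hardware key press would.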
This has been bugging me for a while. In Xcode 4, sometimes the "Find selected text in workspace..." menu item is enabled and sometimes it is disabled. I cannot figure out why it is ever disabled, and there seems to be nothing at all on Google about this.
I have this same problem. If I click on the "Show assistant editor" button (the middle button in the group of Editor buttons located in the upper-right corner) and then go back to "Standard Editor" (the left-most button in the group), then the "Find selected text in workspace..." function is enabled. I have to do this often, but only in the projects I created before Xcode 4, so I think some setting in the project was not created properly when Xcode 4 converted it over.
I have found that if you just right-click on the word, without selecting it first, then the menu item will be enabled. This seems to be more prevalent in Xcode 4.4.1. I have also noticed that selecting other words will sometimes "trigger" the menu to become enabled as well. Hope this helps.