I would like to make an input method that is used only within a Qt desktop application.
It would be like the Chinese (Pinyin) input method on Windows, including script processing and rendering of candidate words.
Since it includes rendering of words, it can't be created with a keyboard layout.
Moreover, when built into the application, it could be used across other platforms as well.
But it is not like an on-screen keyboard.
Thanks to all
The Qt way to implement this is to provide an input method plugin; see the general plugin development docs and the input-method-specific base class.
With this you should be able to implement your own input method. Stuff like script processing and rendering is then up to your own plugin.
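To make that concrete, here is a minimal, hypothetical sketch of the commit step such a plugin performs once your own composition logic has chosen a word. It uses the Qt 5 QPlatformInputContext base class (a private QPA header, so the project needs QT += gui-private); the class name and the commitText() helper are invented, and the script processing and candidate rendering would be your own code:

    #include <qpa/qplatforminputcontext.h>   // private QPA API
    #include <QInputMethodEvent>
    #include <QGuiApplication>
    #include <QCoreApplication>

    // Hypothetical input context: pinyin-style script processing and the
    // rendering of the candidate window would live inside this class.
    class MyInputContext : public QPlatformInputContext
    {
    public:
        bool isValid() const { return true; }

        bool filterEvent(const QEvent *event)
        {
            // Feed key events into your own composition engine here and
            // return true once the event has been consumed.
            Q_UNUSED(event);
            return false;
        }

        // Push the chosen word into whatever object currently has focus.
        void commitText(const QString &text)
        {
            QObject *focus = QGuiApplication::focusObject();
            if (!focus)
                return;
            QInputMethodEvent commit;        // no preedit text
            commit.setCommitString(text);    // e.g. the selected Chinese word
            QCoreApplication::sendEvent(focus, &commit);
        }
    };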
I designed a Qt application in Qt Creator. As you know, when you build a new form it is possible to drag and drop the default items into the main window. Instead of using the classic "push button", I created a custom button by adding a .qml file to the project. The problem now is that I don't know how I can use (or integrate) the new button inside the form of my project.
Thanks in advance
As long as your component is in the path of your application's QML files, all you need to do to use it is place it somewhere. You don't need to include or import anything. Any user component is directly available to the entire project.
As long as the QML component is made only from built-in components, it can even be safely loaded from an arbitrary location on disk, over the network, or just from a source string. Check this answer for details on dynamic instantiation.
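If you also need to create an instance of the component from C++ rather than from QML, a rough sketch using the Qt 5 / Qt Quick 2 API could look like the following; the resource path, function name and parent item are placeholders, not anything your project must contain:

    #include <QQmlEngine>
    #include <QQmlComponent>
    #include <QQuickItem>
    #include <QUrl>
    #include <QDebug>

    // Hypothetical helper: load CustomButton.qml at runtime and attach the
    // created item to a parent item in the scene.
    QQuickItem *createCustomButton(QQmlEngine *engine, QQuickItem *parentItem)
    {
        QQmlComponent component(engine, QUrl("qrc:/qml/CustomButton.qml"));
        if (component.isError()) {
            qWarning() << component.errorString();
            return 0;
        }
        QQuickItem *item = qobject_cast<QQuickItem *>(component.create());
        if (item)
            item->setParentItem(parentItem);   // make it part of the visible scene
        return item;
    }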
Some friendly advice: type the code, don't use the visual editor - it is pretty weak.
EDIT: I don't know about you, but for me it seems like every custom QML file in the project's QML folder is automatically added to the QML types in the designer library. So, contrary to what I assumed, you shouldn't really need to do anything to make your custom type available for use in the designer.
I found this one-year-old question when facing the same problem. This is what I found out:
If you save your custom button with an initial uppercase letter (CustomButton.qml instead of customButton.qml), Qt Designer shows your component in the library panel correctly.
Sometimes you need to restart Qt Designer for this to work.
When it comes to designing a GUI in Qt, I am hesitating between using the designer in Qt Creator, or doing everything in source code. I'm using Qt widgets and not QML.
If I use the designer I can easily create a GUI using Qt standard widgets. But as soon as I need to subclass a widget to extend its functionality, I have to build a Designer plugin to support my new widget. Is that correct? Or is there another way to do it?
You can build all of the GUI in Designer, including custom widgets, and you can also build your custom widgets themselves in Designer.
Designer does not need to interpret your custom widgets. Just use the promote functionality: you start with a plain widget within Designer and then tell it the "real" class (your custom one) and the header file where it is declared. The only drawback is that within Designer it will keep looking like an empty widget.
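To make the workflow concrete, here is a hypothetical header for such a custom class (class name, file name and the painting code are invented). In Designer you would drop a plain QWidget, right-click it, choose "Promote to...", and enter exactly this class name and header file:

    // plotwidget.h
    #ifndef PLOTWIDGET_H
    #define PLOTWIDGET_H

    #include <QWidget>
    #include <QPainter>

    class PlotWidget : public QWidget
    {
        Q_OBJECT
    public:
        explicit PlotWidget(QWidget *parent = 0) : QWidget(parent) {}

    protected:
        // Custom painting as a stand-in for whatever extended functionality
        // the widget actually provides.
        void paintEvent(QPaintEvent *)
        {
            QPainter painter(this);
            painter.drawText(rect(), Qt::AlignCenter, "custom widget");
        }
    };

    #endif // PLOTWIDGET_H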
In my experience, it is much better to use Designer for the GUI than to write the source code yourself. You can easily change all the properties afterwards, and it is helpful even if you rely on custom widgets. Source code is not a good declarative language for GUI objects, with all their properties. Also, you cannot play around; you would have to recompile every time just to answer "Is it better to have this text label in bold font?".
Sometimes I edit the XML (.ui) files created by Designer by hand, for example if I want to put a widget somewhere else in the object tree. If you don't mess up the XML, Designer will still read it and not destroy your changes. The only reason I see for writing GUI code by hand is when you have repetitive elements or dynamic changes based on data input, e.g. a for() loop that produces elements. In my project I have some selector boxes that are filled with options in the source code.
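As an illustration of that last point, filling such a selector box is the kind of data-driven code that is easier in C++ than in Designer. The widget name, the option list and the MainWindow/ui setup below are made-up examples of a typical Designer-generated project:

    #include <QComboBox>
    #include <QStringList>

    // Hypothetical: populate a combo box ("selector box") with options that
    // are only known, or only convenient to list, at runtime.
    void MainWindow::populateSelectors()
    {
        const QStringList options = QStringList() << "PNG" << "JPEG" << "TIFF";
        foreach (const QString &option, options)
            ui->formatSelector->addItem(option);
    }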
And by the way: if you prefer to write your GUI in code instead of using Designer, maybe you are not the right person to craft the GUI. Most programmers don't realize that being technically able to build a GUI does not mean they are also competent at designing one.
http://hallofshame.gp.co.at/index.php?file=shame.htm&mode=original
It is a bit of a shortcut, but I often use a simple QWidget as a container for my custom widget. This way, I can set up sizing policies and put the whole thing in the layout I want before my custom widget is even in. Then, in C++, I add the custom widget as a child of the container widget.
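A rough sketch of that approach, assuming a placeholder QWidget named modelContainer in the .ui file and a hand-written MyCustomWidget class (both names are invented for the example):

    #include <QVBoxLayout>

    // Hypothetical: embed the hand-written widget into the Designer-made
    // placeholder so it picks up the layout and sizing policies set up there.
    void MainWindow::embedCustomWidget()
    {
        MyCustomWidget *custom = new MyCustomWidget(ui->modelContainer);

        QVBoxLayout *layout = new QVBoxLayout(ui->modelContainer);
        layout->setContentsMargins(0, 0, 0, 0);
        layout->addWidget(custom);
    }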
edit: As ypnos mentioned, you can promote the placeholder directly. You can find guidelines here.
I have a custom form that I wrote manually, without using Orbeon Form Builder. I want to save the XML file to Alfresco when the form is submitted. This can be achieved with forms created in Form Builder using the workflow-send button. My question is how to use this button with forms that were written manually.
That button and the Alfresco functionality are part of Form Runner and they were not meant to work independently. This is not to say that it's not possible, but you will have to look at the source code and get at least a minimal understanding of it.
The place to start for Alfresco support is alfresco-model.xml. You might be able to start by including this model with XInclude in your own form. Then, that model has at least a dependency on an instance called fr-parameters-instance which provides the form and app name. This is used to read configuration properties.
The second place to look is persistence-model.xml, which is the place which actually uses alfresco-model.xml.
I just wonder what would be the best solution for receiving text input from the user in PlayN.
I didn't find anything I could use to achieve this. I think the best solution would be to render something like an HTML input to type text into, but it won't be that simple, because we need to be able to use, for example, the virtual keyboard on the Android platform and the regular keyboard on the HTML backend. Even then, I think it will be very difficult (or impossible) to invoke the Android keyboard in a game...
I'm thinking about creating a widget in the Tripleplay UI library (because I will be using it), but that would end with rendering a virtual keyboard on screen for user input... buttons from a-z, etc.
I wonder, is there any better solution for this, or do I need to implement something like what I wrote above (like a Tripleplay widget)?
There is already a Tripleplay widget for receiving text input called Field.
However, it is very primitive and does not yet work on mobile platforms (it will work on an Android device with a hardware keyboard). We need to provide an API in PlayN to display the virtual keyboard, but until then there is no way for the widget to trigger it.
I don't recommend using this for any substantial text input, however. It doesn't (and never will) support cut and paste, language input methods, or any of the other extremely complex features that users expect for text input.
I would like to add an API to PlayN like:
Keyboard.requestTextInput(String label, Callback<String> callback)
which would pop up the virtual keyboard, with an attached (native) text box, and allow the user to enter a single line of text using all of the machinery of the platform's native text entry support. This will allow them to cut-and-paste, and use language input methods, and provide an experience with which they are comfortable on the platform in question.
If your game needs more sophisticated text input (like a chat interface, or the ability to take pages of notes), you will probably have to create a separate interface for each platform that you wish to support, using native multiline text editing widgets and then "wire" those into your PlayN game. This will be more complicated than can be described in a simple SO answer, so you'll have to do some research and learn how PlayN manages the display on each of the backends that you wish to support.
I need an appropriate rendering engine to render a 3D model in Qt, and to then use Qt as an event handler for this model. Thank you.
As for the rendering engine, you can go in several directions, but I'll mention two. Qt comes with an OpenGL widget. You could make use of it by either:
writing your own OpenGL rendering code for your model (see the sketch after this list)
or
making use of a rendering engine/framework which has its own Qt widget (possibly derived from Qt's OpenGL widget). I know OpenSceneGraph has Qt integration available, and I seem to remember Ogre does as well. That's just to mention two options. Just Google for Qt and your favourite engine and you'll most likely find something suitable. And if not, it is usually not that difficult to write your own integration if you feel like it.
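To give an idea of the first option, a bare-bones QGLWidget subclass (Qt 4.x era API; the class name and the drawModel() placeholder are invented) could look like this:

    #include <QGLWidget>

    class ModelView : public QGLWidget
    {
        Q_OBJECT
    public:
        explicit ModelView(QWidget *parent = 0) : QGLWidget(parent) {}

    protected:
        void initializeGL()
        {
            glEnable(GL_DEPTH_TEST);
            glClearColor(0.2f, 0.2f, 0.2f, 1.0f);
        }

        void resizeGL(int w, int h)
        {
            glViewport(0, 0, w, h);
        }

        void paintGL()
        {
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            // drawModel();  // your own mesh drawing or scene-graph calls go here
        }

        // Reimplement Qt's input handlers (mousePressEvent, wheelEvent, ...) to
        // rotate, zoom or pick the model; this is where Qt's event handling comes in.
    };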
Whichever option is more suitable for you depends on the exact specification of what you're trying to achieve. You probably know that better than we do.
As for the event handling, you might want to be a bit more specific as to what you mean.