How to convert a BufferedImage to a PlayN Image? - playn

I came across an interface that transforms Java BufferedImages into PlayN images:
http://code.google.com/p/playn/source/browse/java/src/playn/java/JavaBitmapTransformer.java?spec=svn16f0f5b72f732f47b05c2857e1b64bc6bbe5c14f&r=16f0f5b72f732f47b05c2857e1b64bc6bbe5c14f
But I was not able to figure it out. Does anyone know how to use it exactly?
Thanks in advance.

You can't use it directly in a cross-platform way.
This interface does not transform a BufferedImage into a PlayN image; it transforms one BufferedImage into another BufferedImage. The resulting BufferedImage is then used to create a PlayN image in the Java platform-specific code of PlayN.
This interface is used under the hood to support the transform method on Image.

Related

Lib to display OpenGL video in PyQt?

I know that a classic way of displaying OpenGL elements inside (Py)Qt is to override the QGLWidget.paintGL() method and make direct OpenGL calls.
However, does anyone know of a lib I could use to avoid making direct calls to OpenGL? Something that would, for example, load an image or video frame onto a textured triangle strip in a single call? Or any kind of OpenGL video lib that I could integrate with PyQt?
My final goal is to display OpenGL video inside a PyQt window.

How do I render a progressive JPEG on a QWidget as it is being downloaded?

With Qt, rendering a fully downloaded image on a QWidget is fairly easy. However, I want to render a partially downloaded progressive JPEG (or PNG) as it is being fetched.
As I remember, only the GIF format is suitable for that purpose, and I don't see a way to do this with Qt's tools and libraries. I think you would need to find a way to convert the partially downloaded image to a QPixmap or QImage and show it.
You can render the image on a QGraphicsView. For this purpose, you can write a QGraphicsItem which has two QImage pointers as member variables. One of them can be used as a buffer so that the incoming bytes can be written into it; the other image can be used for rendering. The paint() function should be as below:
painter->drawImage(0,0,*mRenderImage);
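A rough structural sketch of that item might look like the following. It is only an outline: the names are placeholders, and the decoding step is deliberately naive (QImage::loadFromData will usually only succeed once enough of the file has arrived, so a real implementation would plug an incremental/progressive decoder in there instead).
#include <QByteArray>
#include <QGraphicsItem>
#include <QImage>
#include <QPainter>

// Double-buffered item: incoming bytes are decoded into mBufferImage, which is
// then swapped with mRenderImage so paint() always draws a complete image.
class ProgressiveImageItem : public QGraphicsItem
{
public:
    explicit ProgressiveImageItem(const QSize &size)
        : mBufferImage(new QImage(size, QImage::Format_RGB32)),
          mRenderImage(new QImage(size, QImage::Format_RGB32))
    {
        mBufferImage->fill(Qt::gray);
        mRenderImage->fill(Qt::gray);
    }

    ~ProgressiveImageItem() override { delete mBufferImage; delete mRenderImage; }

    // Call this for every chunk received from the network.
    void appendData(const QByteArray &chunk)
    {
        mReceived.append(chunk);
        QImage decoded;
        // Naive: try to decode everything received so far; swap in an
        // incremental decoder here for true progressive rendering.
        if (decoded.loadFromData(mReceived)) {
            *mBufferImage = decoded;
            qSwap(mBufferImage, mRenderImage);
            update();   // schedule a repaint
        }
    }

    QRectF boundingRect() const override
    {
        return QRectF(QPointF(0, 0), mRenderImage->size());
    }

    void paint(QPainter *painter, const QStyleOptionGraphicsItem *, QWidget *) override
    {
        painter->drawImage(0, 0, *mRenderImage);
    }

private:
    QByteArray mReceived;    // raw bytes received so far
    QImage *mBufferImage;    // written into as data arrives
    QImage *mRenderImage;    // drawn by paint()
};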

How to render custom video data in Qt ?

I have never done a Qt video application; I'm a newbie in this area. I have a custom video file format to render using Qt. The file format is a 256-byte header, then pixel data, then another 256-byte header, then pixel data, and so on.
The header contains info like the width and height in pixels, bytes per pixel, frame rate, etc., and the pixel data is in Bayer (GBRG) format. I may have to process the data before display, like converting it to RGB (not sure yet). I see there are a lot of video-related classes like QGL*, QMovie, QVideo*, and I don't know where to start. The mandelbrotwidget example looks like a good place to start, but I still need some advice. Also, do I have to write a Qt image plugin for the Bayer pattern?
Thank you.
Good advice is to do it all yourself. If you have a simple data structure, read it with plain C++ code.
The conversion from the Bayer pattern to RGB can also be done without using any Qt objects; a rough sketch is shown below.
Once you have a plain RGB image (even in your own structure) per frame, you can show it with widgets such as QGL* if you prefer OpenGL rendering, or with Qt classes such as QPainter and QImage.
Some more links: C++ GUI Programming with Qt4, 2nd Edition; Graphics View Framework
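For the Bayer step, here is a very rough sketch of a half-resolution, nearest-neighbour GBRG-to-RGB conversion into a QImage (8-bit samples and a tightly packed buffer are assumptions; a real demosaicing algorithm would interpolate to full resolution):
#include <QColor>
#include <QImage>
#include <cstdint>

// Naive half-resolution demosaic: each 2x2 GBRG block
//   G B
//   R G
// becomes one RGB output pixel.
QImage bayerGbrgToRgb(const uint8_t *bayer, int width, int height)
{
    QImage rgb(width / 2, height / 2, QImage::Format_RGB32);
    for (int y = 0; y + 1 < height; y += 2) {
        for (int x = 0; x + 1 < width; x += 2) {
            const int g1 = bayer[y * width + x];            // green (top-left)
            const int b  = bayer[y * width + x + 1];        // blue  (top-right)
            const int r  = bayer[(y + 1) * width + x];      // red   (bottom-left)
            const int g2 = bayer[(y + 1) * width + x + 1];  // green (bottom-right)
            rgb.setPixel(x / 2, y / 2, qRgb(r, (g1 + g2) / 2, b));
        }
    }
    return rgb;
}
One such QImage per frame can then be handed to whichever display path you pick (QPainter, Graphics View, or QGL*).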
The best place to start is by learning the basics of custom drawing in Qt. In short, a very simple implementation would require you to:
Create a custom QWidget subclass
Override the paintEvent() method
Use QPainter/QImage to decode your raw video data into image data and draw it on screen (a bare-bones sketch is shown below)
Qt has lots of good sample code to get you started, such as:
http://qt-project.org/doc/qt-4.8/examples-painting.html
Once you have a simple implementation up and running, and a basic grasp of Qt concepts/classes, then you'll be ready to profile, optimize, and make use of more advanced Qt functionality (GL, video) as needed.
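If it helps, a bare-bones version of those three steps might look like this; it assumes you have already read a frame from your file and converted it to an RGB QImage (the class and method names are just placeholders):
#include <QImage>
#include <QPainter>
#include <QWidget>

// Minimal custom widget: repaints itself whenever a new decoded frame is set.
class VideoWidget : public QWidget
{
public:
    explicit VideoWidget(QWidget *parent = nullptr) : QWidget(parent) {}

    // Call this from your file-reading/decoding loop for every new frame,
    // e.g. driven by a QTimer running at the frame rate from the header.
    void setFrame(const QImage &frame)
    {
        mFrame = frame;
        update();                                // schedule a repaint
    }

protected:
    void paintEvent(QPaintEvent *) override
    {
        QPainter painter(this);
        if (!mFrame.isNull())
            painter.drawImage(rect(), mFrame);   // scale the frame to the widget
    }

private:
    QImage mFrame;                               // current decoded RGB frame
};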

Flex 4: VideoDisplay can play MP3 files?

Is it possible to play MP3 files using the VideoDisplay or VideoPlayer components?
Thank you.
Actually, yes, they can play MP3 files. I just got it working by simply passing the path of an MP3 to a VideoPlayer component instance.
Although I wouldn't recommend using a video component to solely play audio files, I agree that it's sometimes appropriate to play a sound file in a video display component. In my case I have a mixed list of audio and video media items and want a unified preview area and playback/scrub controls.
Why would you use a video component to play a sound file? Either way, you should probably google before posting here. This is how you do it:
var snd:Sound = new Sound(new URLRequest("smallSound.mp3"));
snd.play();

Is it possible to use imagemagick as library reference from Adobe AIR?

I'm trying to load preview thumbnails of high-resolution images, and the application needs to be able to load 100 hi-res images at one time. The only way I know how to do this is to use the Loader class, which means I have to load the ENTIRE file, then scale the image down and use the data as an image preview.
What I'd like to do is use ImageMagick (or some other efficient image-manipulation lib) to compress the image and return that result (without saving it to a file first; that would be optimal) back to my AIR application so that I can use it as a preview. This would be AWESOME.
Thanks in advance.
CommandProxy by Mike Chambers. Look it up (http://www.mikechambers.com/blog/2008/01/17/commandproxy-net-air-integration-proof-of-concept/). This nifty trick lets your AIR application communicate with non-AIR processes.