Jailbreak app - Play background music during a phone call - jailbreak

I'm in the process of writing a jailbreak iPhone app that will play background music during a call.
Is there a way to play sounds or music in the background WHILE on a call, so that the other person can hear it too?

Wouldn't this be a tweak? Try hooking into the Music or Messages app.
%hook WhateverClassYouWant
// Your code here
%end
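
To flesh that skeleton out, here is a minimal sketch of a Theos/Logos tweak that starts an AVAudioPlayer when a call-connected method fires. PhoneCallController and callDidConnect are placeholder names I've made up for illustration, not the real private telephony API; you would need to find the actual class and method for your iOS version with a tool like class-dump:

#import <AVFoundation/AVFoundation.h>

static AVAudioPlayer *player;

%hook PhoneCallController // placeholder class name; find the real one with class-dump
- (void)callDidConnect { // placeholder method name
    %orig;
    NSURL *url = [NSURL fileURLWithPath:@"/Library/MyTweak/music.mp3"];
    player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL];
    [player play];
}
%end

Note that AVAudioPlayer on its own only plays audio locally on the device; mixing it into the voice uplink so the other party hears it is the hard part and is not covered by the public frameworks.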

Related

A-Frame - VR Mode mouse control / move in browser

I'm really new to A-Frame / WebVR. When I'm in VR mode on my mobile device, how can I prevent the sensor-based control / camera movement? For example, when I switch to VR mode I want to move the camera with my USB mouse, just like in "normal mode", rather than with the device's sensors. Is it possible?
Thanks.
I don't think this is currently possible. Another potential issue with this approach is that it will likely cause simulator sickness:
The most important guideline in designing for virtual reality is to always maintain head tracking. Never stop tracking the user’s head position inside of the application. Even a short pause in head tracking will cause some users to feel ill.
https://designguidelines.withgoogle.com/cardboard/designing-for-google-cardboard/physiological-considerations.html
That said, if your purpose for this functionality is spectating on a 2D screen, A-Frame 0.8.0 allows you to add a spectator camera:
https://aframe.io/docs/0.8.0/components/camera.html#properties
There are no tutorials for this yet, but you could try asking another question here or on the A-Frame Slack on how to get it up and running.
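
Going by the camera component documentation linked above, a minimal scene sketch might look like the following (untested; check the 0.8.0 docs for the exact property names):

<a-scene>
  <!-- The user's normal head-tracked VR camera -->
  <a-entity camera look-controls position="0 1.6 0"></a-entity>
  <!-- A second camera rendered to the 2D screen while in VR mode -->
  <a-entity camera="spectator: true; active: false" position="0 2 4"></a-entity>
</a-scene>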

A-Frame: how to simulate tracked controllers when developing on desktop?

My HTC Vive is set up in a different room to my developer workstation. When developing for A-Frame, I understand I can: use my desktop monitor instead of a headset; use mouse dragging instead of motion controls; use WASD instead of room-scale tracking. However, what is the preferred way to simulate the tracked controllers?
For example, how can I move the cubes in this demo from my desktop: https://aframe.io/examples/showcase/tracked-controllers
This is not yet released, but we're working on tools to record the camera and controllers, output to a file, and then load it up on any device and replay the camera and controller poses and events without needing a headset. The project is led by Diego:
https://github.com/dmarcos/aframe-motion-capture
http://swimminglessonsformodernlife.com/aframe-motion-capture/examples/
This will become the common way to develop for VR without having to enter and re-enter VR to test each code change, and to do it on the go.

Play a sound on my computer when a visitor clicks a button. {Collaboration ideas}

I have a room full of sales representatives. Currently, when they make a sale, they write it on the board and ring a bell next to the board. Lately we've been too busy for them to get up and ring the bell. I would like each rep to have a small window or application open on their computer with a button, and a separate window or application open on my computer that makes a bell sound when they click the button. I do not want this to play on the other reps' computers - just mine. Does anyone have any ideas on how this can be done?
Yes! You can make a new dummy Facebook account and use its private messages as the 'button': whenever someone sends the account a message, you will hear the notification sound. However, you will have to avoid using other Facebook accounts on the same computer.

Triggering Windows Store background task from Band tile opened events

Is there a good way to trigger a Windows Store background task when a band tile is opened? And are there any examples for working with the band from a background task with the latest SDK? I have seen mentions of the ability to do so but can't find any code examples of this.
I have a scenario where the tile's content is only valid for a short time (~30 seconds) and would like to wake up a background service on the phone while the band's tile is open to update the content as needed.
I was hoping to find an IBackgroundTrigger in the SDK that would do the trick, but no luck there. The best I can think of to fill this need would be a task that uses a system trigger and hooks up listeners for the tile opened/closed events. That seems like a lot of unnecessary work for the task, though, and could waste battery on the phone.
Thanks,
Tony
I'm afraid that is not possible. As far as I can see from the SDK, the app on the phone has to be running to allow a tile on the band to send an event back to the phone.
An alternative would be to open your app from the Band using a voice command. Would that solve your problem?

Can we use iOS technologies in an Apple Watch app?

I want to create a music app in which the watch extension shows an audio waveform, so my question is: can we use iOS technologies like OpenGL in a watch app?
You can't run any code on the watch. You can only run code in a Watch extension in your iOS app and update a relatively static UI on the watch. You could generate images in your extension for the audio wave, put them together into an animation and then update the UI with that.
It would be possible to pass some information from your iOS app to the Watch extension running on the phone, which could then update a pre-defined interface on the Watch app. However, if you are wanting to provide a real-time audio waveform, I think this could face major problems regarding latency.
Note that, as Stephen Johnson states, you could only do this by rendering static images which would then be sent to the watch for display, or by having pre-installed images in your watch interface that you rapidly show or hide to give the impression of levels changing. The latter would be a much more promising approach from a latency standpoint, and given that Apple demonstrates a circular progress indicator made up of 360 images, perhaps it would even appear to animate smoothly. However, the key question would be whether the peaks would appear on the Watch screen close enough to when they actually occurred in the music for the user to see them as being linked.
It might be possible to pre-process the audio and build in a delay to both the display of the peaks and the audio playback to manage the communication latency—but testing that would really only be possible once you had Watch hardware in your hand.
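
A minimal sketch of the pre-installed-images approach, assuming a WatchKit extension, frame images named "wave-0" through "wave-9" (hypothetical names) bundled with the Watch app, and a WKInterfaceImage outlet wired up in the storyboard:

#import <WatchKit/WatchKit.h>

@interface WaveInterfaceController : WKInterfaceController
@property (weak, nonatomic) IBOutlet WKInterfaceImage *waveImage;
@end

@implementation WaveInterfaceController

- (void)willActivate {
    [super willActivate];
    // The frames live in the Watch app bundle, so the animation runs on the
    // watch itself and no image data crosses the Bluetooth link per frame.
    [self.waveImage setImageNamed:@"wave-"]; // base name; WatchKit appends 0, 1, 2, ...
    [self.waveImage startAnimatingWithImagesInRange:NSMakeRange(0, 10)
                                           duration:1.0
                                        repeatCount:0]; // 0 repeats indefinitely
}

@end

Driving which frames play from live audio data would still have to go through the phone-to-watch channel, which is exactly where the latency concerns above come in.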
