Jailbreak and overclock iPad CPU - jailbreak

It's a pretty simple question: how can I jailbreak and overclock my iPad 3rd gen? Apple has a system in place to downclock the CPU as the battery gets older, and it's gotten pretty bad (20-minute load times on web pages, apps not functioning). I know that I'll have to tether it to a wall 24/7, but that's better than how it functions now.

The iPad does not have the iOS throttling feature; only iPhones have it. But all smartphones and tablets have a hardware throttling mechanism, and your battery is the cause of the slowdown. The PMIC (power management integrated circuit) regulates the voltages and clock signal generators. It is controlled by its own firmware, separate from iOS, usually in mask ROM (not modifiable). This protects the CPU, baseband processor, etc.: when you put load on a faulty battery, its supply voltage drops during the peak load, leading to a crash or reboot. The PMIC filters this away, and in the case of a bad battery it has to lower performance to prevent crashes.
Bottom line: you need a new battery. An electric car won't run as fast on a near-empty battery as it would on a full one.

Related

ESP32-S2 QT PY Deep Sleep Wakeup

I currently have an ESP32-S2 QT Py that I am using to send sensor data to a PC over WiFi using TCP. I want to preserve battery life while not sensing, so I put it into deep sleep. I see that there are timer, pin, and touch alarms to wake it back up, but none of these work well for my design. Is there any way that I can wake up a deep-sleeping ESP32-S2 with a TCP packet sent from the PC server?
This functionality is commonly called "wake on LAN" and is generally used, as you described, to allow a "magic" packet to wake up a sleeping or hibernating device.
In this case, no. The ESP32's wifi radio is off during deep sleep. With the radio off there's nothing active that can receive or detect a packet.
If you're trying to conserve power and maximize battery life, you need to keep the radio off as it uses quite a bit of power just to stay connected to a wifi network.
The only parts of the ESP32 that are active and available during deep sleep are the ULP ("Ultra Low Power") processor, a very limited slow processor that can perform a few I/O actions during sleep, and a small amount of static RAM associated with the real time clock. You can learn more about the ULP and what it can do in Espressif's documentation.
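If a fixed reporting interval is acceptable, a timer wakeup is the usual workaround. Here is a minimal sketch (Arduino-ESP32 core assumed; the 60-second interval is an arbitrary placeholder):

#include <esp_sleep.h>

const uint64_t SLEEP_US = 60ULL * 1000000ULL;  // report every 60 s

void setup() {
  // ... connect to WiFi, read the sensor, send the TCP packet here ...
  esp_sleep_enable_timer_wakeup(SLEEP_US);  // arm the RTC timer
  esp_deep_sleep_start();                   // radio and CPU power down here
}

void loop() {}  // never reached; execution restarts in setup() after wakeup

Note that waking from deep sleep is effectively a reboot: anything that must survive sleep has to live in RTC memory (for example, variables marked RTC_DATA_ATTR).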

How to set Transmitting Speed vs Scanning Speed for the AltBeacon Library

The AltBeacon library is great at detection! One thing we noticed in testing is that if we have an app doing both transmitting and receiving, we are not picking up the other phones at times (very sporadic). With real devices like iBeacons we are constantly able to pick them up.
My question is: how do we control the frequency of the transmitter vs the frequency of the scanning (receiving) so that we can do both transmission and detection at the same time?
My goal is to achieve the best of both worlds, scanning and transmitting. Is that even possible?
https://altbeacon.github.io/android-beacon-library/beacon-transmitter.html
By default, the Android Beacon Library's BeaconTransmitter uses the highest power and frequency allowed by the underlying APIs in the Android operating system. Here are the settings, showing the defaults:
beaconTransmitter.setAdvertiseTxPowerLevel(
AdvertiseSettings.ADVERTISE_TX_POWER_HIGH);
beaconTransmitter.setAdvertiseMode(
AdvertiseSettings.ADVERTISE_MODE_LOW_LATENCY);
While the settings are configurable, presumably you already want the fastest and strongest advertising for your use case. And that is exactly what the library does with no extra configuration. (Note: there is very little reason to lower the transmit power or frequency, because tests show that transmitters use negligible battery. See my blog post here: http://www.davidgyoungtech.com/2015/11/12/battery-friendly-beacon-transmission)
If you are seeing that hardware beacons are reliable but some phone models' transmitters are detected only sporadically, then the issue may be hardware problems with those phones themselves. You may wish to characterize which ones are problematic.
I can confirm that I see very strong transmissions from the Pixel 3a, Moto G7, Samsung Galaxy S10 and Huawei P9 Lite I have handy.
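On the scanning side, if memory serves, the library's BeaconManager also exposes scan-period setters (setForegroundScanPeriod() and setForegroundBetweenScanPeriod()); shortening the between-scan period trades battery life for detection latency, which may help when one app both transmits and scans.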

What is the lowest latency communication method between a computer and a microcontroller?

I have a project in which I need the lowest latency possible (in the 1-100 microsecond range at best) for communication between a computer (Windows + Linux + macOS) and a microcontroller (Arduino or STM32 or anything).
I stress that it not only has to be fast, it has to have low latency (for example, a fast communication link to the Moon will still have high latency).
For the moment, the methods I have tried are serial over USB and HID packets over USB. I get round-trip results of a little less than a millisecond. My measurement method is a round-trip communication, then dividing by two. This is OK, but I would be much happier with something faster.
EDIT:
The question seems to be quite hard to answer. The best workaround I found is to synchronize the clocks of the computer and the microcontroller. Synchronization does require communication, indeed. With the process below, dt is half a round trip and sync is the difference between the clocks.
t = time()                      // host timestamp before the exchange
write(ACK)                      // ask the MCU for its timestamp
remotet = read()                // MCU replies with its own clock value
dt = (time() - t) / 2           // one-way delay = half the round trip
sync = time() - remotet - dt    // offset between the host and MCU clocks
Note that the imprecision of this synchronization is at most dt. The importance of the fastest possible communication channel stands, but I now have an estimate of the precision.
Also note the technicalities related to different timestamp formats on different systems (µs/ms since the epoch on Linux, ms/µs since the MCU booted on Arduino).
Pay attention to clock drift on the Arduino. It is safer to synchronize often (on every measurement in my case).
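For reference, a minimal sketch of what the microcontroller side of this exchange might look like (plain Arduino assumed; the timestamp reply for the sync variant is shown as a comment):

void setup() {
  Serial.begin(115200);
}

void loop() {
  if (Serial.available()) {
    Serial.write(Serial.read());  // echo immediately for the round-trip test
    // For the sync variant, reply with the local clock instead:
    // Serial.println(micros());
  }
}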
USB raw HID with a hacked 8 kHz poll rate (125 µs poll interval) combined with a Teensy 3.2 (or above). Mouse overclockers have achieved an 8 kHz poll rate with low USB jitter, and the Teensy 3.2 (an Arduino-compatible board) can do an 8 kHz poll rate with a slightly modified USB driver on the PC side.
If that isn't enough and you need even better, you're looking at PCI-Express parallel ports, to do lower-latency signalling via digital pins directly on the parallel port. They must be true parallel ports, not ones behind a USB layer. DOS apps on gigahertz-class PCs have been measured achieving sub-1 µs pin signalling (on a 1.4 GHz Pentium 4), and if you write a virtual device driver you can probably get sub-100 µs within Windows.
Use raised priorities and critical sections up the wazoo, preferably a non-garbage-collected language, a minimum of background apps, and essentially consume 100% of a CPU core with your critical loop, and you can reliably achieve <100 µs. Not 100% of the time, but certainly in five-nines territory (and probably far better than that), if you can tolerate such aberrations.
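A minimal host-side sketch of that "dedicate a core at raised priority" approach (Linux assumed; SCHED_FIFO needs root or CAP_SYS_NICE, and the priority and core numbers are arbitrary):

#include <sched.h>
#include <cstdio>

int main() {
  // Raise the process to real-time FIFO priority so normal tasks cannot preempt it.
  sched_param sp{};
  sp.sched_priority = 80;  // 1..99; higher preempts lower
  if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0) {
    perror("sched_setscheduler");  // typically needs root
    return 1;
  }

  // Pin to one core so caches stay warm and the thread never migrates.
  cpu_set_t set;
  CPU_ZERO(&set);
  CPU_SET(2, &set);
  sched_setaffinity(0, sizeof(set), &set);

  for (;;) {
    // ... poll the device/port here; never sleep, never allocate ...
  }
}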
To answer the question, there are two low latency methods:
Serial or parallel port. It is possible to get latency down to the millisecond scale, although your performance may vary depending on manufacturer. One good brand is Brainboxes, although their cards can cost over $100!
Write your own driver. It should be possible to achieve latencies on the order of a few hundred microseconds, although obviously the kernel can interrupt your process midway if it needs to serve something with a higher priority. This is how a lot of scientific equipment actually works (and a lot of the people telling you that a PC can't be made to work on short deadlines are wrong).
For info, I just ran some tests on a Windows 10 PC fitted with two dedicated PCIe parallel port cards.
Sending TTL (square wave) pulses out using Python code (actually using PsychoPy Builder and PsychoPy Coder), the two-channel oscilloscope showed very consistent offsets between the two pulses of 4 µs to 8 µs.
This was when the Python code was run at 'above normal' priority.
When run at normal priority it was mostly the same, apart from a very occasional 30 µs gap (presumably when task switching took place).
In short, PCs aren't set up to handle deadlines that short. Even using a bare-metal RTOS on an Intel Core series processor you end up with interrupt latency (how fast the processor can respond to interrupts) in the 2-3 µs range (see http://www.intel.com/content/dam/www/public/us/en/documents/white-papers/industrial-solutions-real-time-performance-white-paper.pdf).
That's ignoring any sort of communication link like USB or ethernet (or other) that requires packetizing data, handshaking, buffering to avoid data loss, etc.
USB stacks are going to have latency, regardless of how fast the link is, because of buffering to avoid data loss. Same with ethernet. Really, any modern stack driver on a full blown OS isn't going to be capable of low latency because of what else is going on in the system and the need for robustness in the protocols.
If you have deadlines in the single-digit microsecond range (or even in the millisecond range), you really need to do your real-time processing on a microcontroller and have the slower control loop/visualization handled by the host.
You have no guarantees about latency to userland without a real-time operating system. You're at the mercy of the kernel, its time slices, and its preemption rules, which could cost more than your 100 µs maximum.
In order for a workstation to respond to a hardware event you have to use interrupts and a device driver.
Your options are limited to interfaces that offer an IRQ:
Hardware serial/parallel port.
PCI
Some interface bridge on PCI.
The sound card, if you're into abusing I/O.
USB is not one of them; it has a 1 kHz polling rate.
Maybe Thunderbolt does, but I'm not sure about that.
Ethernet
Look for a board that has a gigabit ethernet port directly connected to the microcontroller, and connect it to the PC directly with a crossover cable.
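If you try the direct-Ethernet route, here is a minimal host-side sketch to measure the round trip over UDP (POSIX sockets assumed; the board's address, port, and the echoing firmware on the MCU side are hypothetical):

#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <chrono>
#include <cstdint>
#include <cstdio>

int main() {
  const char* kBoardIp = "192.168.1.2";  // hypothetical board address
  const uint16_t kPort = 5005;           // hypothetical echo port

  int sock = socket(AF_INET, SOCK_DGRAM, 0);
  sockaddr_in dst{};
  dst.sin_family = AF_INET;
  dst.sin_port = htons(kPort);
  inet_pton(AF_INET, kBoardIp, &dst.sin_addr);

  char buf[8] = "ping";
  auto t0 = std::chrono::steady_clock::now();
  sendto(sock, buf, sizeof(buf), 0, (sockaddr*)&dst, sizeof(dst));
  recvfrom(sock, buf, sizeof(buf), 0, nullptr, nullptr);  // board echoes it back
  auto t1 = std::chrono::steady_clock::now();

  double us = std::chrono::duration<double, std::micro>(t1 - t0).count();
  printf("round trip: %.1f us (one way ~%.1f us)\n", us, us / 2);
  close(sock);
  return 0;
}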

Does a BLE device read advertising packets when not scanning? (autoconnect)

I read in some places that advertising packets are sent to everyone within range. However, does the other device need to be scanning to receive them, or will it receive them anyway?
The problem:
Let's say I'm establishing a piconet between 5 or 6 BLE devices. At some point I have some connections between the slaves and one master. Then, if one of the devices gets removed or shut off for a few days, I would like it to reconnect to the network as soon as it is turned back on.
I read about the autoconnect feature, but it seems that when you set it to true, the device starts a background scan which is actually slower (in frequency) than a manual scan. This makes me conclude that for autoconnect to work, the device that is being turned back on needs to advertise again, right? And if autoconnect really runs a slow background scan, it seems to me that you can never receive the adv packets instantly unless you're actively scanning somehow. Does that make sense?
If so, is there any way around it? I mean, a way to detect the device that is coming back into range instantly?
Nothing is "Instant". You are talking about radio protocols with delays, timeouts, retransmits, jamming, etc. There are always delays. The important thing is what you consider acceptable for your application.
A radio transceiver is either receiving, sleeping, or transmitting, on one given channel at a time. Transmitting and receiving imply power consumption.
When a Central is idle (not handling any connection at all), all it has to do is scan. It can do so full time (even if the spec says this should be duty-cycled). You can expect to actually receive an advertising packet from a peer Peripheral the first time it is transmitted.
When a Central is maintaining connections to multiple Peripherals, its transceiver time is shared between all the connections it has to maintain. Background scanning is considered low priority and gets some of the remaining transceiver time, so an advertising Peripheral may send its ADV packet while the Central is not listening.
Here comes statistical magic:
The spec says the interval between two advertising events must be augmented with a (pseudo-)random delay. This ensures the Central (scanner) and the Peripheral (advertiser) will manage to see each other at some point in time. Without this random delay, their timing allocations could become harmonic but out of phase, and they might never see each other.
Depending on the parameters used on Central and Peripheral (advInterval, advDelay, scanWindow, scanInterval) and radio link quality, you can compute the probability to be able to reach a node after a given time. This is left as an exercise to the reader... :)
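As a rough illustration (numbers assumed, ignoring channel selection and packet loss): with scanWindow = 30 ms and scanInterval = 300 ms, the Central listens 10% of the time, so each advertising event has roughly a 0.1 chance of being heard. With advInterval = 100 ms, about ten advertising events fit in one second, giving a discovery probability of roughly 1 - 0.9^10 ≈ 65% within the first second and 1 - 0.9^30 ≈ 96% within three seconds.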
In the end, the question you should ask yourself looks like: "is it acceptable that my Peripheral reconnects to my Central within 300 ms in 95% of cases?"

Energy efficiency of Android sensors vs Bluetooth low energy sensors?

I am making an android application that requires me to detect the user's motion.
My application also requires me to use an external sensor, which is a Bluetooth smart sensor, for some other purposes.
Now I have two options:
use the accelerometer and gyroscope of the Android phone
fetch motion information from the Bluetooth Smart sensor.
I understand that Bluetooth Smart (BLE) is more energy efficient than classic Bluetooth sensors.
However, I am confused as to which of the above options will give me the more energy-efficient solution on the Android device. EDIT: I am presently not concerned with the energy efficiency of the Bluetooth device.
Also, please note that I want this comparison only because I don't need to detect user motion accurately; otherwise an external device (a Bluetooth Low Energy device) would have been better hands down.
The accelerometer and gyroscope will take roughly the same amount of power whether they're on the phone or on the external device. The difference is that the external device has to transmit that information over a radio link to the phone. It makes more sense to just use the phone's existing sensors if they're sufficient, because that won't require any radio transmissions and will use less power. Also, the phone will have a much larger battery.
The sensors on the phone have nothing to do with Bluetooth... they're incorporated right into the hardware.
EDIT: The difference between Bluetooth and BLE is that BLE uses the radio much more sparingly. Radio transmissions take a good chunk of power, so using on-board sensors is (most likely) going to take far less power than using a radio to communicate with an external sensor. Also, I have a feeling the phone's accelerometer is always on, so getting readings from it takes no more power than is already being used.
