I use an iPad Pro 9.7 as a recording studio. I know that the screen is touch sensitive, and I've noticed that the only synth apps that seem to use that feature are iFretless Bass and iFretless Guitar. The piano apps I use claim to be touch sensitive but aren't on the touch screen. If I use an external keyboard to drive the apps, then voilà, touch sensitivity. Could I have disabled the feature somewhere?

I'm a huge fan of MIDI and constantly run across people who say MIDI never sounds real. I tell them that MIDI is amazing and that their impression comes from having only heard bad MIDI: everything within a performance is captured by the MIDI data, which records and plays back exactly what was played. I would love to record performances directly from the iPad so I could illustrate that power, and I don't really want to have to hook up an external keyboard just to access the touch-sensitive aspects of a performance. Could someone point me in a good direction for accessing touch sensitivity with just the iPad?
Only iPhones have 3D Touch; the iPad screen is too large for that technology. Touch sensitivity on the iPad requires some other kind of sensor, such as the Apple Pencil.
The iFretless web page says:
A patented algorithm uses the accelerometer to accurately detect touch pressure on the screen
Thank you for your response. I am baffled, then, by iFretless Bass and iFretless Guitar, two iPad apps that do use velocity control. If I touch the screen lightly, the tone is that of a lightly plucked string; tapping with more force delivers a completely different sound from the app. My question, I guess, should be: if the technology is there to have touch sensitivity in a music app, why is it not being utilized?
Apparently, it's patented, and in any case it's not built into iOS, so it's not easy to implement.
They don't link to the patent, but if it's monophonic, processing the accelerometer (when the device is resting on a squishy surface) can pretty obviously get you something like velocity. The touch information that iOS provides also includes a finger radius, which can indicate how hard the screen is being pressed.
I use this a little in my app PolyHarp, but it's a pretty subtle effect.
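To make the accelerometer trick concrete, here is a rough, hypothetical sketch of the idea in plain Python rather than iOS API code. The function name, the 1 g resting baseline, and the 0.5 g full-scale jolt are my assumptions, not anything from the apps or the patent; it just maps the jolt a tap produces into a MIDI velocity:

```python
def tap_velocity(accel_samples, rest=1.0, full_scale=0.5):
    """Map the accelerometer jolt around a touch event to a MIDI velocity.

    accel_samples: accelerometer magnitudes (in g) captured in a short
                   window around the tap.
    rest:          expected magnitude at rest (~1 g, gravity).
    full_scale:    jolt (in g) that should map to the maximum velocity, 127.
    """
    # Size of the jolt = largest deviation from the resting 1 g reading.
    jolt = max(abs(a - rest) for a in accel_samples)
    # Scale linearly into the MIDI velocity range, clamped to 1..127.
    velocity = round(127 * min(jolt / full_scale, 1.0))
    return max(1, velocity)

# A gentle tap barely disturbs the reading; a hard tap spikes it.
print(tap_velocity([1.0, 1.02, 1.05, 1.0]))  # soft tap → 13
print(tap_velocity([1.0, 1.3, 1.6, 1.1]))    # hard tap → 127
```

This only works when the tap actually jolts the whole device, which is presumably why the device needs to sit on a squishy surface, and why the technique is easier to pull off monophonically.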
There are two main touch technologies that modern iPhones use. Apple's "3D Touch" provides the ability to recognize "force touch" gestures. It's not really a "how hard is this press?" reading; it's more a forceful-touch-versus-regular-touch distinction, although I believe a more nuanced reading must be possible, given the art apps that recognize pressure. There's also the haptic technology that provides feedback for certain touches via a small motor mounted under the touch screen that vibrates. You know the fingerprint-reading button on the pre-Face ID phones? It's totally not a button: it's an illusion created by haptics, one so realistic it's almost impossible to convince your brain the "button" isn't actually depressing. But that's a little off topic.
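On devices that do expose 3D Touch, the system reports a per-touch force value along with a device-specific maximum. A hypothetical sketch of how an app might turn such a reading into a MIDI velocity, in plain Python rather than actual iOS code (the function name and the example maximum of 6.67 are my assumptions):

```python
def force_to_velocity(force, max_force):
    """Normalize a per-touch force reading into a MIDI velocity (1-127).

    force:     the raw force reported for the current touch.
    max_force: the device's maximum possible force for a touch.
    """
    # Clamp the normalized force into [0, 1] before scaling.
    normalized = max(0.0, min(force / max_force, 1.0))
    # Floor at velocity 1 so even the lightest touch stays audible.
    return max(1, round(127 * normalized))

print(force_to_velocity(1.0, 6.67))  # light touch → 19
print(force_to_velocity(6.0, 6.67))  # near-maximum press → 114
```

So a continuous reading is there in principle; the catch, as below, is that the hardware is not on the iPad.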
The number of apps that use force touch isn't high, partly because it's not clear Apple will continue to support it. It was removed from the iPhone XR, leading a lot of developers to suspect it's being phased out. Honestly, for the uses Apple was putting it to, it was a bit of a gimmick. And yeah, it was never successfully implemented on the iPad.