Spoiler alert – I think iOS 11 will be very significant, and positive for iOS music producers.
This article aims to gather the released details in one place, to help us music producers prepare for its release, so that disruption to workflows (and forum whinging) can be minimized.
I’m going to arrange these in sections – for reference, and because iOS 11 will matter differently to different workflows. Please add whatever you know in the Comments Section and I’ll incorporate it and credit you.
One other note – this article is light on photos. You’ll find plenty out there; let me know if you come across any that are open-sourced so I can share and incorporate them here.
AUAudioUnit – Preferred View Configuration (AU GUI Sizes)
Prior to iOS 11, when an Audio Unit is loaded into its host app, the host decides how to display the hosted AU’s UI, and the AU is responsible for accommodating the host’s choice. No standard size is defined, however, so AU developers have to guess; because there are relatively few iOS AU host apps, a de facto common GUI size has emerged. AUM is the only app, to my knowledge, with a resizable GUI for hosted AUs. iOS 11 addresses this with Preferred View Configurations. The host app presents its hosted AU with an array of view sizes it will allow. The AU responds by telling the host which of those views it supports. Lastly, the host decides which of the supported views is best suited to its hosting area. It will be interesting to see how Preferred Views pan out across the several iOS AU host apps. Personally, I’d like a somewhat larger AU GUI than is currently the norm.
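To make the three-step negotiation concrete, here’s a toy pure-Swift sketch. The real types and calls are `AUAudioUnitViewConfiguration`, `AUAudioUnit.supportedViewConfigurations(_:)` (which returns an `IndexSet`), and `select(_:)` in CoreAudioKit; the struct, the minimum-size policy, and the “largest wins” host rule below are all made-up stand-ins so the logic can run on its own.

```swift
import Foundation

// Toy stand-in for CoreAudioKit's AUAudioUnitViewConfiguration, so the
// negotiation logic can be sketched without the real framework.
struct ViewConfiguration {
    let width: Double
    let height: Double
}

// AU side: mirrors AUAudioUnit.supportedViewConfigurations(_:), which returns
// an IndexSet of the host-offered configurations the AU can actually draw.
// The minimum-size rule here is an invented example policy.
func supportedConfigurations(from offered: [ViewConfiguration],
                             minWidth: Double, minHeight: Double) -> IndexSet {
    var supported = IndexSet()
    for (index, config) in offered.enumerated()
        where config.width >= minWidth && config.height >= minHeight {
        supported.insert(index)
    }
    return supported
}

// Host side: offer an array of sizes, then pick from what the AU accepted.
// (A real host would call the AU's select(_:) with the winning configuration.)
let offered = [ViewConfiguration(width: 400, height: 200),
               ViewConfiguration(width: 800, height: 400),
               ViewConfiguration(width: 1024, height: 768)]
let accepted = supportedConfigurations(from: offered, minWidth: 500, minHeight: 300)
let chosen = accepted.max().map { offered[$0] }  // largest accepted size wins here
```

The host stays in charge throughout: the AU never dictates a size, it only filters the host’s offers.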
AU MIDI Output & Host Sync
This is a great feature. AUs can now emit MIDI, with the added bonus of being synced to their audio output. The host app can record and edit both the MIDI and audio output from the AU. The host also exposes a musicalContextBlock property, which the AU can call each render cycle. The musicalContextBlock lets the AU query the host’s current tempo (BPM), time signature, beat position (at the beginning of the currently rendered buffer), and the number of samples between the beginning of the buffer and the next beat (which can be 0). Lastly, it provides the beat position corresponding to the beginning of the current measure. This should be a terrific workflow enhancement – again, contingent on AU host implementation. iOS hosts have long had the ability to record AU parameter automation, though for a while no host app had implemented the feature. **Update – BM3 has implemented AU parameter automation**
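Here’s a runnable sketch of the direction of that exchange: the host supplies a block, and the AU calls it at the top of each render cycle. The real type is AudioToolbox’s `AUHostMusicalContextBlock`, which fills out C pointers for each field; the struct, the hard-coded values, and the 44.1 kHz assumption below are stand-ins for readability.

```swift
// Stand-in for the host-provided AUHostMusicalContextBlock (AudioToolbox).
// The real block fills out pointers for each field; a struct keeps this readable.
struct MusicalContext {
    let tempo: Double                   // host tempo in BPM
    let timeSignatureNumerator: Double
    let timeSignatureDenominator: Int
    let beatPosition: Double            // beat at the start of the current buffer
    let sampleOffsetToNextBeat: Int     // samples from buffer start to next beat (can be 0)
    let measureDownbeatPosition: Double // beat position of the current measure's downbeat
}

// Host side: a block the AU can call once per render cycle.
// These values are invented for the example.
let musicalContextBlock: () -> MusicalContext? = {
    MusicalContext(tempo: 120, timeSignatureNumerator: 4, timeSignatureDenominator: 4,
                   beatPosition: 16.5, sampleOffsetToNextBeat: 11_025,
                   measureDownbeatPosition: 16.0)
}

// AU side: query the context and derive samples per beat
// (assuming a 44.1 kHz sample rate).
var samplesPerBeat = 0.0
if let context = musicalContextBlock() {
    samplesPerBeat = 44_100.0 * 60.0 / context.tempo
}
```

This is what lets an arpeggiator or MIDI-emitting AU land its events exactly on the host’s grid rather than drifting against it.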
Inter-Device Audio & MIDI (IDAM)
This is a feature I’m very excited about. IDAM lets you connect your iOS device to your macOS computer with the Lightning-to-USB cable; you enable it via Audio MIDI Setup. Prior to iOS 11, IDAM was a 2-channel, 48 kHz audio pipe from iOS to macOS. With iOS 11, IDAM gains class-compliant audio and MIDI: class compliant with USB Audio 2.0 (note this is not USB 2.0) and with USB MIDI Class 1.0. So now when IDAM is connected, the iOS device becomes a class-compliant USB MIDI host, receiver, or both. Class compliance means 16 MIDI channels between macOS and iOS, as well as a separate MIDI host port. As before, sync and charge will still work with IDAM enabled. In addition to the 16 MIDI channels, your iOS device should also list itself as a MIDI input/output in your macOS DAW or anything else that sends or receives MIDI.
If you look at the picture at the top of my post, you’ll notice the iPad now appears as a MIDI device – does this mean iOS devices could be native control surfaces in desktop DAWs like Ableton Live? And could desktop control surfaces be slaved to your iPad or iPhone? There are quite a few exciting possibilities here!
Files
The new Files app in iOS 11 will likely change your workflow significantly, and probably for the better. Files is a native app that replaces the iCloud Drive app. It lets you quickly browse all your cloud drives, iPad-accessible file locations, and your recently deleted files, all from one handy list. The Files GUI also shows up in the Document Picker in apps like AudioShare, so apps with Document Picker access will connect to Files right away. It’s an open question (at this point) how well the Files app will be received; once some apps implement its features it could be successful, but initially it may take some flak.
Drag and Drop
Another highly anticipated feature – you can select and drag multiple files within and between apps, including apps open in Split Screen.
The Taskbar
The new Taskbar is particularly good. On iOS 11, dragging up from the bottom of the screen, even with an app open, brings up the Taskbar – letting you switch between apps without having to double-tap the Home Button and scroll through currently open apps, or tap them open from the Home Screen. Once you’ve dragged up from the bottom of the screen, you can open the apps in your Taskbar, or drag a Taskbar app up to open it in Split Screen beside your currently open app. The Taskbar is no longer limited to 5–6 apps; I’m not sure how many apps you’re allowed, but I’ve seen pictures with at least 12. There’s also a thin vertical line on the Taskbar separating your assigned Taskbar apps from your Recently Opened Apps – the pictures I’ve seen show 3 Recently Opened Apps. The option to show Recently Opened Apps can be toggled on/off from Settings.
Split Screen
Split Screen functionality seems much improved: you no longer have to drag over from the side and scroll through the supported apps, or minimize the most recently opened Split Screen app, before accessing the list. Now, if you want to throw up a MIDI monitoring app to see what’s going on with your currently open app, you just drag up from the bottom of the screen, then drag the MIDI monitoring app up. Resizing apps opened in Split Screen also seems easier, and you can minimize apps together in Split Screen, so that reopening them from a minimized state restores both as you left them. And presumably, you’ll be able to drag and drop files between two apps while they’re open in Split Screen. Hopefully apps will support this, so that samples, presets and the like can quickly be moved between compatible apps just by dragging and dropping them into the apps’ available file folders.
Control Center
The Control Center receives a major makeover – it’s where you’ll access the new Screen Recording function, Camera, Flashlight, and other customizable options. There are even two faders here to quickly adjust Screen Brightness and Volume, rather than having to drill into Settings. It also displays, much like on macOS, all your opened app windows, letting you close or open them. This is far easier on the eyes: rather than swiping through a bunch of tabs to find what you want, you can see them all at a glance.
iCloud Storage
Apple has already reduced the price of iCloud storage, with the 200GB tier coming in at $2.99 per month. iCloud storage is also shareable with family memberships. The Files app makes working with iCloud much easier.
App Store
The App Store receives a major overhaul as well. Music, among other categories, now has its own page, with rankings for paid and free music apps. This alone gives the iOS music ecosystem a much more pro feel. Search functionality has also changed, with IAPs now searchable, and inexact searches will pull up more results similar to your query. Lastly, you can now share your playlists with friends and create an Apple Music profile.
Screen Recording
This looks to be a great feature. From what I’ve read, Screen Recording only captures microphone audio – I hope I’m wrong here, as being able to combine multiple sound sources would be ideal. I’m also curious whether there will be a way to record and play back touches natively, rather than adding call-outs in a third-party editing app. Another open question is privacy: can you screen-record a YouTube video, or your company’s internal video conference call? Screen Recording first has to be enabled in Settings –> Control Center.
Bluetooth
One notable Bluetooth feature in iOS 11: you can now disconnect individual Bluetooth connections, rather than having to turn Bluetooth on/off to do so.
Audio File Formats
iOS 11 supports FLAC, plus several other new formats and spatial-audio features:
- Spatial Audio B-Format (PCM, .caf; B-format channels W, X, Y, Z)
- Higher Order Ambisonics (order N, where N is 1..254; SN3D normalization)
- ACN (Ambisonic Channel Number) channel ordering
- Conversion between B-format, ACN_SN3D, and ACN_N3D, and from ambisonics to arbitrary speaker layouts
- Spatial Mixer: Head-Related Transfer Function (HRTF) rendering (AUSpatialMixer, AVAudioEnvironmentNode) – better frequency response and better localization of sources in 3D space
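To give a feel for what “conversion from ambisonics to an arbitrary speaker layout” means, here’s a simplified first-order, horizontal-only B-format decode. The gain formula is the classic first-order ambisonic decoding equation; the quad speaker layout and the hard-front test source are made-up examples, and real decoders (like Apple’s) handle normalization variants and height channels too.

```swift
import Foundation

// Simplified first-order horizontal ambisonic decode: each speaker's signal is
// a weighted sum of the B-format channels, based on the speaker's azimuth.
// (Real decoders also handle SN3D/N3D normalization and height channels.)
func decode(w: Double, x: Double, y: Double,
            toSpeakersAt azimuths: [Double]) -> [Double] {
    azimuths.map { az in
        0.5 * (sqrt(2.0) * w + x * cos(az) + y * sin(az))
    }
}

// Decode a source panned hard front (x = 1) to an example quad layout.
let quad = [45.0, 135.0, 225.0, 315.0].map { $0 * .pi / 180 }
let gains = decode(w: 1.0 / sqrt(2.0), x: 1.0, y: 0.0, toSpeakersAt: quad)
// Front speakers come out louder than rear ones, as you'd expect.
```

The point is that the same B-format signal can be decoded to any layout just by changing the azimuth list, which is exactly the flexibility the new APIs expose.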
AUGraph – Apple has announced that AUGraph will be deprecated in 2018, with AVAudioEngine as its replacement.
Links to Official Apple statements on iOS 11 Audio & Related Features
AuV3 Document Revision (updated 6-6-2017) – Some documentation of new AuV3 features, mostly not discussed in this article:
Audio Unit – Supporting Parameter Automation – To get a better understanding of how Audio Units and their hosts work together:
AVAudioEngine – Manual Rendering
This feature, new to iOS 11, allows the Core Audio engine to process audio in both offline and realtime modes. The advantage of offline rendering is that it can take place at a lower priority than realtime processing requires, which should let developers incorporate more powerful audio algorithms. Mixing down tracks is a good example of offline “post-processing” that could benefit, and realtime users benefit too, since CPU resources are freed up.
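The shape of an offline mixdown is a simple chunked loop. In real code you would call `AVAudioEngine.enableManualRenderingMode(.offline, format:maximumFrameCount:)` and then `renderOffline(_:to:)` repeatedly; `renderChunk` below is a stand-in for that call so the loop logic can run on its own, and the one-second length and 4096-frame chunk size are arbitrary example values.

```swift
// Sketch of a chunked offline render loop. In real code, renderChunk would be
// AVAudioEngine.renderOffline(_:to:) after enabling
// enableManualRenderingMode(.offline, format:maximumFrameCount:).
let totalFrames = 44_100       // one second of audio at 44.1 kHz (example)
let maximumFrameCount = 4_096  // the limit passed when enabling manual rendering

// Stand-in render call; the real one fills an AVAudioPCMBuffer
// and returns a status.
func renderChunk(_ frameCount: Int) -> Bool { true }

var renderedFrames = 0
while renderedFrames < totalFrames {
    let frames = min(maximumFrameCount, totalFrames - renderedFrames)
    if renderChunk(frames) {
        // Real code would append the rendered buffer to an output file here.
        renderedFrames += frames
    }
}
```

Because nothing here is tied to the audio clock, the loop can run faster or slower than realtime, which is what makes heavier processing practical.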
AirPlay 2 Support
I don’t know too much about this feature, which will be new to iOS, tvOS and macOS. Here are some aspects:
- Multi-room audio with AirPlay 2 capable devices
- Long-form audio applications
- Content – music, podcasts etc.
- Separate, shared audio route to AirPlay 2 devices
- New AVAudioSession API for an application to identify itself as long-form
My basic understanding is that with AirPlay 2 devices, if you’re listening to music from iTunes, for example, and a phone call comes in, iTunes will continue playing to your AirPlay 2 speakers while the phone call uses the system audio of your i-device. Apps that don’t identify themselves as long-form continue to use the system audio route.
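The “identify itself as long-form” step from the bullet list above maps to a single configuration call, as I understand it: iOS 11 adds a routeSharingPolicy parameter to AVAudioSession’s setCategory. This is an iOS-only fragment (it won’t compile outside Apple platforms), and the exact parameter spelling should be checked against current documentation.

```swift
import AVFoundation

// Sketch: an app opts its audio into the separate, shared long-form
// AirPlay 2 route via the routeSharingPolicy parameter new in iOS 11.
let session = AVAudioSession.sharedInstance()
try session.setCategory(AVAudioSessionCategoryPlayback,
                        mode: AVAudioSessionModeDefault,
                        routeSharingPolicy: .longForm,
                        options: [])
```

Everything else about the session behaves as before; only the routing of the app’s output changes.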
I’d be curious if anyone knows more about AirPlay 2 Support. Will these devices be compatible with AirPlay 1 devices? What happens in a mixed device environment?
Well, that’s a good enough start. Let me know what other categories, features, and tidbits you’ve learned about iOS 11! One last note: any features described here are subject to change prior to the official iOS 11 release.
Caching Service & Internet Sharing
If you are running macOS High Sierra and have your iOS device connected via Lightning-to-USB, you can navigate to System Preferences –> Sharing –> Content Caching. There are two options here:
- Cache iCloud Content – This is the tethered-caching feature, previously available only in macOS Server and, more recently, via command-line tools; with High Sierra it’s native. If the service is enabled and your device is verified to your Mac via iTunes, you get a few benefits: any iOS apps, iCloud books, etc. you download via the iTunes Store on the Mac, as well as system updates, will be cached to your Mac’s hard drive and then delivered to the iOS device via USB, which is much faster than the over-the-air method. Additionally, if you have multiple devices in a family, all of their updates will also be served from the Caching Service. This saves download bandwidth, speeds up installs, and of course caches the updates, facilitating more secure and reliable backups of your devices. One point I forgot to mention: the Caching Service also requires your Mac to be connected via Ethernet.
- Internet Sharing – This is a great feature. When you’re connected via Lightning-to-USB, enabling Internet Sharing shares the Mac’s internet connection over USB. In my tests, I was able to get 70 Mbps downloads on my iPad, which is my home internet’s maximum speed. And while you can also use Ethernet from iOS with certain dongles, it’s nice to have that solid connection over Lightning-to-USB.
**One note – if you enable IDAM, both the Caching Service and Internet Sharing will be disabled. They will automatically re-enable when the IDAM connection is disconnected.**