Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under General subtopic

Post

Replies

Boosts

Views

Activity

Can individual Apple Developer accounts stream full tracks with MusicKit?
I have implemented fetching Apple Music preview songs using a Swift framework integrated into a Unity app. My requirement is to fetch full tracks from a user’s Apple Music library and play them inside Unity. To do this, I understand that I need to handle authentication, generate a Developer Token, and then obtain a Music User Token to access the user’s Apple Music content. Currently, I have an Individual Apple Developer account (not Organization).

Based on my research, it seems that:
- With an Individual account, I can implement this functionality and even upload builds to TestFlight for internal testing.
- However, when releasing the app publicly on the App Store, full-track playback may be restricted for Individual accounts and allowed only for Organization accounts.

👉 Can you confirm if this understanding is correct?
👉 Specifically, is it possible for an Individual account to fetch and play full-length tracks from a subscribed Apple Music user’s library (at least for internal/TestFlight testing)?
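Editor's note on the MusicKit mechanics mentioned above: when the Swift MusicKit framework is used directly, full-length playback for a subscribed user is gated by MusicAuthorization plus one of the MusicKit players, and the framework manages the developer token for a properly configured app. The sketch below illustrates only that flow under those assumptions; it says nothing about Individual vs. Organization account entitlements, and the function name and search term are arbitrary examples.

import MusicKit

// Minimal sketch: authorize, find a catalog song, and play the full track
// for a user with an active Apple Music subscription.
func playFirstMatch(for term: String) async throws {
    // 1. Ask the user for MusicKit authorization.
    guard await MusicAuthorization.request() == .authorized else { return }

    // 2. Search the catalog (the user's library could be queried instead).
    var request = MusicCatalogSearchRequest(term: term, types: [Song.self])
    request.limit = 1
    let response = try await request.response()
    guard let song = response.songs.first else { return }

    // 3. Queue and play the full-length track.
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: [song])
    try await player.play()
}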
0
0
144
3d
Apple Music API: Adding To Collaborative playlist gives 500 error
I am using https://developer.apple.com/documentation/applemusicapi/add-tracks-to-a-library-playlist to add tracks to playlists. This endpoint works fine for all playlists except collaborative playlists. For a collaborative playlist I get the following 500 error as a response:

{
  "errors": [
    {
      "id": "<some id>",
      "title": "Upstream Service Error",
      "detail": "Unable to update tracks",
      "status": "500",
      "code": "50001"
    }
  ]
}

Steps to reproduce:
1. Create a playlist in your library.
2. Use the API to add a song. Confirm that it works.
3. Make that same playlist collaborative.
4. Update the playlist ID in your API request (as making a playlist collaborative changes its ID).
5. Confirm that you get the 500 error.
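For readers reproducing this, here is a rough Swift sketch of the request in question, assuming plain URLSession rather than whatever client the poster uses; the developer token, Music User Token, playlist ID, and song ID are all placeholders.

import Foundation

// Sketch of the "Add Tracks to a Library Playlist" call described above.
// All identifiers and tokens are placeholders.
func addSong(_ songID: String, toPlaylist playlistID: String,
             developerToken: String, userToken: String) async throws {
    let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    request.setValue(userToken, forHTTPHeaderField: "Music-User-Token")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: [
        "data": [["id": songID, "type": "songs"]]
    ])

    let (body, response) = try await URLSession.shared.data(for: request)
    let status = (response as? HTTPURLResponse)?.statusCode ?? -1
    // A 2xx status indicates success; the poster sees 500 for collaborative playlists.
    print("HTTP \(status):", String(data: body, encoding: .utf8) ?? "")
}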
2
0
462
6d
Donated INPlayMediaIntent to system, but it does not show in Control Center
I donate an INPlayMediaIntent to the system (the donation succeeds), but it does not show up in Control Center. My code is as follows:

let mediaItems = mediaItems.map { $0.inMediaItem }
let intent = if #available(iOS 13.0, *) {
    INPlayMediaIntent(mediaItems: mediaItems,
                      mediaContainer: nil,
                      playShuffled: false,
                      playbackRepeatMode: .none,
                      resumePlayback: true,
                      playbackQueueLocation: .now,
                      playbackSpeed: nil,
                      mediaSearch: nil)
} else {
    INPlayMediaIntent(mediaItems: mediaItems,
                      mediaContainer: nil,
                      playShuffled: false,
                      playbackRepeatMode: .none,
                      resumePlayback: true)
}
intent.suggestedInvocationPhrase = "播放音乐" // "Play music"
let interaction = INInteraction(intent: intent, response: nil)
interaction.donate { error in
    if let error = error {
        print("Intent donation failed: \(error.localizedDescription)")
    } else {
        print("Intent donation succeeded ✅")
    }
}
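An editor's aside that may explain part of the behavior (not something the thread confirms): the Now Playing tile in Control Center is driven by MPNowPlayingInfoCenter and MPRemoteCommandCenter while audio is actually playing, whereas INPlayMediaIntent donations mainly surface as Siri Suggestions. A minimal sketch of the Now Playing side, with placeholder metadata:

import MediaPlayer

// Sketch: publish Now Playing info so Control Center shows the current item.
// Title, duration, and elapsed time are placeholder values.
func publishNowPlayingInfo() {
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: "Example Track",
        MPMediaItemPropertyPlaybackDuration: 240.0,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: 0.0,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]

    // Registering remote commands enables the transport controls.
    let commands = MPRemoteCommandCenter.shared()
    _ = commands.playCommand.addTarget { _ in .success }
    _ = commands.pauseCommand.addTarget { _ in .success }
}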
2
0
74
1w
Use MusicKit's User Library Artists with Catalog Artists?
When making a call to https://api.music.apple.com/v1/me/library/artists to get a user's library artists, it returns the following (as an example):

[
  {
    id: 'r.FCwruQb',
    type: 'library-artists',
    href: '/v1/me/library/artists/r.FCwruQb?l=en-US',
    attributes: { name: 'A Great Big World' }
  },
  {
    id: 'r.7VSWOgj',
    type: 'library-artists',
    href: '/v1/me/library/artists/r.7VSWOgj?l=en-US',
    attributes: { name: 'Aaliyah' }
  },
  ...
]

If I try to use an artist ID from that returned data to look up additional information about the artist by calling https://api.music.apple.com/v1/catalog/us/artists/{id}, it fails. User library artists don't seem to equal catalog artists. It'd be great if there was a way to use these interchangeably. Am I missing something?
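One avenue worth checking (an assumption to verify against the Apple Music API documentation, not something the thread confirms): library resources expose a catalog relationship, so requesting include=catalog on the library-artists call may return the matching catalog artist, and with it a catalog ID that works with /v1/catalog/{storefront}/artists/{id}. A rough sketch with placeholder tokens:

import Foundation

// Sketch: fetch library artists together with their catalog counterparts.
// The include=catalog relationship is an assumption to verify; tokens are placeholders.
func fetchLibraryArtistsWithCatalog(developerToken: String, userToken: String) async throws -> Data {
    var components = URLComponents(string: "https://api.music.apple.com/v1/me/library/artists")!
    components.queryItems = [URLQueryItem(name: "include", value: "catalog")]

    var request = URLRequest(url: components.url!)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    request.setValue(userToken, forHTTPHeaderField: "Music-User-Token")

    let (data, _) = try await URLSession.shared.data(for: request)
    // Each library-artists resource should then carry relationships.catalog.data,
    // whose id (when present) is a catalog artist ID.
    return data
}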
0
0
227
1w
BackgroundAssets `url(for:)` throws error for locally available asset
On an iPhone running iOS 26 beta 5, url(for: FilePath("subdir/asset.mov")) almost always throws this error:

The URL for “subdir/asset.mov” couldn’t be retrieved: “asset.mov” couldn’t be copied to “subdir” because an item with the same name already exists.

Yet contents(at: FilePath("subdir/asset.mov")) always returns Data for a playable AVMovie. How can I avoid this url(for:) error? The asset pack in question is downloaded, and the error persists even after pack deletion, redownload, relaunch, and combinations of those.

// Assets repo root
subdir.aar
subdir/asset.mov
subdir/asset_thumb.heic
subdir/Manifest.json

// Manifest.json
{
  "assetPackID": "subdir",
  "downloadPolicy": { "onDemand": {} },
  "fileSelectors": [
    { "directory": "subdir" }
  ],
  "platforms": ["iOS", "visionOS"]
}

xcrun ba-package subdir/Manifest.json -o subdir.aar
xcrun ba-serve --host 192.168.0.10 -p 443 subdir.aar
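Not an answer to the url(for:) error itself, but since contents(at:) returns valid data, one stopgap is to stage that Data at a temporary URL and play from there. A sketch of only the staging step, using nothing beyond Foundation; the default file name simply mirrors the asset in the post.

import Foundation

// Workaround sketch: write the Data returned by contents(at:) to a temporary
// file so it can be played by URL while the url(for:) error is investigated.
func stageTemporaryCopy(of assetData: Data, fileName: String = "asset.mov") throws -> URL {
    let directory = FileManager.default.temporaryDirectory
        .appendingPathComponent("StagedAssetPacks", isDirectory: true)
    try FileManager.default.createDirectory(at: directory, withIntermediateDirectories: true)

    let destination = directory.appendingPathComponent(fileName)
    // Replace any stale copy left over from a previous launch.
    try? FileManager.default.removeItem(at: destination)
    try assetData.write(to: destination, options: .atomic)
    return destination
}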
2
0
133
3w
Why does CADisplayLink of an external UIScreen drift in time?
I am using Apple's original Lightning Digital AV Adapter (Lightning-to-HDMI dongle) to connect my iPhone to an external display via an HDMI cable. I need to synchronize rendering with the external display's refresh rate, so I create a new CADisplayLink tied to the external display's UIScreen: UIScreen.screens[externalDisplayIdx].displayLink(withTarget:, selector:). The callback is being called regularly, but with increasing delay relative to the CADisplayLink.timestamp, so each time the callback is called, I have less and less time to draw the next frame (see the snippet below).

Assuming 60 FPS, the value of secondsTillDeadline starts at an arbitrary value in the range of approximately -0.0001 to 0.0166667, and then it slowly decreases towards zero (and for a brief period it goes into small negative numbers). Once it reaches zero, it flips back to 0.0166667 and continues to decrease again. This cycle repeats indefinitely. Changing the external display's resolution (UIScreen's mode) or the CADisplayLink's preferredFrameRateRange to a lower FPS does not seem to have any effect on the temporal drifting (even the rate of change seems to be the same).

When I create a new CADisplayLink for the iPhone's main screen, the value of secondsTillDeadline is stable; it does not drift and it is very close to 0.0166667, as expected.

Is this drift caused by the external monitor or by Apple's Lightning-to-HDMI dongle, or is the problem somewhere else? Can the drifting be stopped?

func onDisplayLinkUpdate(displayLink: CADisplayLink) {
    // Gradually decreases from 0.01667 to -0.0001, then flips back to 0.01667 and continues to decrease
    let secondsTillDeadline = displayLink.targetTimestamp - CACurrentMediaTime()
}
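A diagnostic worth running here (an editor's sketch of one hypothesis, not a confirmed explanation): if the adapter's pixel clock makes the external display's real frame interval differ slightly from the display link's nominal one, that tiny per-frame difference accumulates into exactly this kind of wrap-around drift. Logging the measured callback interval against the nominal one makes any mismatch visible.

import QuartzCore

// Sketch: compare the nominal frame interval of the external screen's display
// link with the interval actually measured between callbacks.
final class DriftProbe: NSObject {
    private var lastTimestamp: CFTimeInterval?

    @objc func onDisplayLink(_ link: CADisplayLink) {
        defer { lastTimestamp = link.timestamp }
        guard let last = lastTimestamp else { return }

        let measured = link.timestamp - last                  // actual callback spacing
        let nominal = link.targetTimestamp - link.timestamp   // what the link plans for this frame
        // A persistent mismatch of even a few microseconds per frame would
        // accumulate into the sliding secondsTillDeadline described above.
        print(String(format: "measured %.6f s, nominal %.6f s, delta %+.1f µs",
                     measured, nominal, (measured - nominal) * 1_000_000))
    }
}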
3
0
182
Jul ’25
Access to favorited artists from Music API?
It's been well over a year since Apple added favoriting of artists back to Apple Music (the little star icon on an artist page), yet I still haven't seen a way to get this data for an authenticated user from the Apple Music API. I was expecting to hear something about this during WWDC, but there have been no announcements that I've caught. Has anyone else heard anything? People assume that when they provide access to their Apple Music account, we can actually get to the data in that account, and we end up looking a little dumb not being able to get this core data.
0
1
262
Jul ’25
Broadcast Upload Extension - Screen Sharing Fails to Start After Countdown (No Errors Logged)
Hello everyone, I'm working on implementing a screen sharing feature using RPSystemBroadcastPickerView and a Broadcast Upload Extension to share the entire app screen in an iOS application. The Broadcast Upload Extension is set up following Apple's ReplayKit guidelines. However, I'm encountering an issue during the broadcast startup sequence.

❗ Problem Description
- The Screen Broadcast UI appears as expected
- I tap “Start Broadcast”
- The countdown (3 → 2 → 1) completes
- Then it immediately reverts to the "Start Broadcast" screen, and screen sharing does not begin
- No error messages are displayed
- None of the extension lifecycle methods (broadcastStarted(withSetupInfo:), processSampleBuffer, etc.) are called
- There are no logs or crash reports, neither in the main app nor in the extension

✅ What Has Been Verified
- Info.plist of the Broadcast Upload Extension includes: NSExtensionPointIdentifier = com.apple.broadcast-services-upload, NSExtensionPrincipalClass set correctly, RPBroadcastProcessMode = RPBroadcastProcessModeSampleBuffer
- preferredExtension is set properly to the extension's bundle identifier
- The extension is listed in the main app's build settings under "Frameworks, Libraries, and Embedded Content"

⚠️ Additional Concern
We noticed that in Xcode (latest version), the Broadcast Upload Extension is listed under "Embedded Frameworks" with the setting "Embed Without Signing", and there is no option to change it to "Embed & Sign". We're wondering if this could be the reason the extension fails to launch correctly at runtime, despite being detected by the broadcast picker.

❓ Questions
- Has anyone faced similar issues where the broadcast never starts despite correct setup?
- Could "Embed Without Signing" be causing the system to silently cancel or ignore the extension at runtime?
- Are there any provisioning profile or entitlement requirements specific to Broadcast Upload Extensions that might trigger this behavior silently?

Any insights, suggestions, or workarounds would be greatly appreciated. Thank you in advance!
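For comparison, here is a minimal sketch of the host-app side of the setup described above; the extension bundle identifier is a placeholder. If the extension still never launches with picker configuration equivalent to this, the embedding/signing question raised above becomes the more likely culprit.

import ReplayKit
import UIKit

// Sketch: a system broadcast picker preconfigured for a specific
// Broadcast Upload Extension. The bundle identifier is a placeholder.
func makeBroadcastPicker() -> RPSystemBroadcastPickerView {
    let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
    picker.preferredExtension = "com.example.myapp.BroadcastUploadExtension"
    picker.showsMicrophoneButton = false
    return picker
}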
0
0
138
Jul ’25
currentPlaybackPitch or preservesPitch parameter
Hi Apple Music API / MusicKit / MediaPlayer team, similar to how currentPlaybackRate keeps the same pitch, it would be great to have a currentPlaybackPitch parameter as well. Alternatively, adding a preservesPitch parameter would also work. I see that the iOS 26 AutoMix feature in Apple Music currently does pitch shifting during music transitions, so maybe this is something that could be exposed in later betas of iOS 26? The main feature request we get is to allow simple pitch changes to the Apple Music content we play through our app. Is this being considered?
0
0
390
Jul ’25
AVSpeechSynthesizer reads Mandarin as Cantonese (iOS 26 beta 3)
In iOS 26, AVSpeechSynthesizer reads Mandarin with Cantonese pronunciation. No matter how I set the language, or change my phone's system settings, it doesn't work.

let utterance = AVSpeechUtterance(string: "你好啊")
//let voice = AVSpeechSynthesisVoice(language: "zh-CN") // does not work
let voice = AVSpeechSynthesisVoice(language: "zh-Hans")  // does not work either
utterance.voice = voice
let synth = AVSpeechSynthesizer()
synth.speak(utterance)
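One way to take the language-to-voice mapping out of the system's hands (a troubleshooting sketch, not a confirmed fix for the beta behavior) is to enumerate the installed voices and assign a Mandarin voice explicitly rather than constructing one from a language code:

import AVFoundation

// Sketch: list installed Chinese voices and pick a Mandarin (zh-CN) voice
// explicitly instead of relying on AVSpeechSynthesisVoice(language:).
let synthesizer = AVSpeechSynthesizer() // keep a strong reference while speaking

func speakMandarin(_ text: String) {
    let chineseVoices = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.language.hasPrefix("zh") }
    chineseVoices.forEach { print($0.language, $0.identifier, $0.name) }

    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = chineseVoices.first { $0.language == "zh-CN" }
    synthesizer.speak(utterance)
}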
1
0
184
Jul ’25
Apple Music / MusicKit and Simulator
It's been an ask for a few years, and I'm wondering if there are any plans, or whether the '26 SDKs/tools allow Apple Music to work in the Simulator? I develop for the Vision Pro, so the usual 'fix' of running on the device is a bit of a hard ask. At the very least, a small sample library that works in the Simulator would be welcome (similar to how Photos works). Cheers
1
1
115
Jul ’25
MusicKit API returns 500 Internal Server Error despite valid JWT and setup
My app is properly configured with MusicKit. I've generated a JWT using my valid credentials (Team ID, Key ID, private key), and I've ensured the time settings are correct via NTP. When I call:

https://api.music.apple.com/v1/catalog/jp/search?term=ado&types=songs

I consistently receive a 500 Internal Server Error. The JWT is generated using ES256 with valid iat and exp values. I've confirmed the token decodes properly using jwt.io, and it's passed via the Authorization: Bearer header.

Things I've confirmed:
- Key ID, Team ID, and private key are correct
- App ID is configured with the MusicKit capability
- JWT is generated and signed correctly
- macOS time is synced via NTP
- Used both curl and Python to test; same result

Is there anything else I should check in the Apple Developer Console (like App ID, Certificates, or provisioning profiles)? Or could this be a backend issue on Apple's side? Any guidance would be appreciated.
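To rule out client-side differences, here is a bare-bones Swift version of the same request (the developer token is a placeholder; the storefront and search term match the post). If this also returns 500 with a token that validates on jwt.io, the request itself is unlikely to be the problem.

import Foundation

// Minimal reproduction of the catalog search request from the post.
// developerToken is a placeholder for the ES256-signed JWT.
func searchCatalog(developerToken: String) async throws {
    var components = URLComponents(string: "https://api.music.apple.com/v1/catalog/jp/search")!
    components.queryItems = [
        URLQueryItem(name: "term", value: "ado"),
        URLQueryItem(name: "types", value: "songs")
    ]

    var request = URLRequest(url: components.url!)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")

    let (data, response) = try await URLSession.shared.data(for: request)
    let status = (response as? HTTPURLResponse)?.statusCode ?? -1
    print("HTTP \(status):", String(data: data, encoding: .utf8) ?? "")
}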
3
0
220
Jul ’25
Personas and Lighting in a visionOS 26 (or earlier) SharePlay Session
Use case: when SharePlaying a fully immersive 3D scene (e.g. a virtual stage), I would like to shine lights on specific Personas, so they show up brighter when someone in the scene is recording the feed (think of a camera person in the scene wearing Vision Pro). Note: this spotlight effect only needs to render in the camera person's headset and does NOT need to be journaled or shared. Before I dive into this, my technical question: can environmental and/or scene lighting affect Persona brightness in a SharePlay session? If not, is there a way to programmatically make Personas "brighter" when recording? My screen recordings always seem to turn out darker than what's rendered in the environment, and manually adjusting the contrast tends to blow out the details in a Persona's face (especially in visionOS 26).
0
0
57
Jun ’25
ffmpeg xcframework not working on Mac, but working correctly on iOS
I have an app (currently in the development stage) that needs to use ffmpeg, so I searched for how to embed ffmpeg in Apple apps and found this article: https://doc.qt.io/qt-6/qtmultimedia-building-ffmpeg-ios.html It is working correctly for iOS but not for macOS (I have made macOS-specific changes using ChatGPT and traditional web searching). Drive link for the files and instructions I'm following: https://drive.google.com/drive/folders/11wqlvb8SU2thMSfII4_Xm3Kc2fPSCZed?usp=share_link Can someone from Apple, or anyone in general, help me figure out what I'm doing wrong?
1
0
100
Jun ’25
Apple Music API High Error Rate
I am getting high error rates from the Apple Music API. This has been happening for months now, and it is quite frustrating. It is a mix of 404, 504, and random 500 errors. I hit these endpoints all of the time, so it is not like I am hitting a resource that doesn't exist. Why is this happening? Is this a known issue that is getting worked on?
0
0
60
Jun ’25
Apple Music API Library Playlist Images
Hi, I'm sending an API request to:

https://api.music.apple.com/v1/me/library/playlists?limit=$limit&offset=$offset

to list all of the user's library playlists. However, the resulting objects do not contain the playlist artwork in the JSON. I've tried adding the extend and include attributes as well, but to no avail. A partial example of the response:

{
  "id": PLAYLIST_ID,
  "type": "library-playlists",
  "href": "/v1/me/library/playlists/PLAYLIST_ID",
  "attributes": {
    "lastModifiedDate": "2024-09-18T20:18:24Z",
    "canEdit": true,
    "name": "Afro Party Anthems",
    "isPublic": false,
    "description": { "standard": "Definitive African party starters" },
    "hasCatalog": false,
    "dateAdded": "2022-03-10T18:30:56Z",
    "playParams": { "id": PLAYLIST_ID, "kind": "playlist", "isLibrary": true }
  },
  "relationships": {
    "catalog": {
      "href": "/v1/me/library/playlists/PLAYLIST_ID/catalog",
      "data": []
    }
  }
}

Is there a way to get the artwork URL without sending a request for each playlist? And if not, can this be fixed?
0
1
69
Jun ’25
_MediaPlayer_AppIntents compilation error for iOS 26
I'm getting an interesting error attempting to compile my app in the Xcode 26 beta:

error: Unable to find module dependency: '_MediaPlayer_AppIntents' (in target 'icatcher' from project 'icatcher')
note: A dependency of main module 'MainModuleCrossImportOverlays' (in target 'icatcher' from project 'icatcher')
Unable to find module dependency: '_MediaPlayer_AppIntents'

Not sure what to try or pull to fix this issue.
1
1
89
Jun ’25
[CoreImage] OS 26 breaks Metal kernels for CIFilters
I maintain a couple of CoreImage libraries that provide custom Metal kernel-backed CIFilters. In iOS/iPadOS 26, the CIColorKernel.apply() method invoked in the CIFilter subclass fails to add the coreimage::destination parameter to the Metal function call:

-[CIColorKernel applyWithExtent:arguments:options:] argument count mismatch for kernel 'FractalNoise3D', expected 13 but saw 12.

I've compiled the code with Xcode 26 and deployed to iOS 18 devices without any breakage, so this is definitely an iOS problem, not an Xcode problem.

Library here: https://github.com/JoshuaSullivan/SimplexNoiseFilter
Feedback ID: FB17874311
1
0
97
Jun ’25