ShazamKit


Get exact audio matching for any audio source using the Shazam catalog or a custom catalog in an app.

ShazamKit Documentation

Posts under ShazamKit tag

15 Posts
Post not yet marked as solved
0 Replies
71 Views
I'm trying to expose my native ShazamKit code to the host React Native app. The implementation works fine in a separate Swift project, but it fails when I try to integrate it into a React Native app.

Exception 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)' was thrown while invoking exposed on target ShazamIOS with params ( 1682, 1683 )

callstack:
0   CoreFoundation            0x00007ff80049b761 __exceptionPreprocess + 242
1   libobjc.A.dylib           0x00007ff800063904 objc_exception_throw + 48
2   CoreFoundation            0x00007ff80049b56b +[NSException raise:format:] + 0
3   AVFAudio                  0x00007ff846197929 _Z19AVAE_RaiseExceptionP8NSStringz + 156
4   AVFAudio                  0x00007ff8461f2e90 _ZN17AUGraphNodeBaseV318CreateRecordingTapEmjP13AVAudioFormatU13block_pointerFvP16AVAudioPCMBufferP11AVAudioTimeE + 766
5   AVFAudio                  0x00007ff84625f703 -[AVAudioNode installTapOnBus:bufferSize:format:block:] + 1456
6   muse                      0x000000010a313dd0 $s4muse9ShazamIOSC6record33_35CC2309E4CA22278DC49D01D96C376ALLyyF + 496
7   muse                      0x000000010a313210 $s4muse9ShazamIOSC5startyyF + 288
8   muse                      0x000000010a312d03 $s4muse9ShazamIOSC7exposed_6rejectyyypSgXE_ySSSg_AGs5Error_pSgtXEtF + 83
9   muse                      0x000000010a312e47 $s4muse9ShazamIOSC7exposed_6rejectyyypSgXE_ySSSg_AGs5Error_pSgtXEtFTo + 103
10  CoreFoundation            0x00007ff8004a238c __invoking___ + 140
11  CoreFoundation            0x00007ff80049f6b3 -[NSInvocation invoke] + 302
12  CoreFoundation            0x00007ff80049f923 -[NSInvocation invokeWithTarget:] + 70
13  muse                      0x000000010a9210ef -[RCTModuleMethod invokeWithBridge:module:arguments:] + 2495
14  muse                      0x000000010a925cb4 _ZN8facebook5reactL11invokeInnerEP9RCTBridgeP13RCTModuleDatajRKN5folly7dynamicEiN12_GLOBAL__N_117SchedulingContextE + 2036
15  muse                      0x000000010a925305 _ZZN8facebook5react15RCTNativeModule6invokeEjON5folly7dynamicEiENK3$_0clEv + 133
16  muse                      0x000000010a925279 ___ZN8facebook5react15RCTNativeModule6invokeEjON5folly7dynamicEi_block_invoke + 25
17  libdispatch.dylib         0x000000010e577747 _dispatch_call_block_and_release + 12
18  libdispatch.dylib         0x000000010e5789f7 _dispatch_client_callout + 8
19  libdispatch.dylib         0x000000010e5808c9 _dispatch_lane_serial_drain + 1127
20  libdispatch.dylib         0x000000010e581665 _dispatch_lane_invoke + 441
21  libdispatch.dylib         0x000000010e58e76e _dispatch_root_queue_drain_deferred_wlh + 318
22  libdispatch.dylib         0x000000010e58db69 _dispatch_workloop_worker_thread + 590
23  libsystem_pthread.dylib   0x000000010da67b84 _pthread_wqthread + 327
24  libsystem_pthread.dylib   0x000000010da66acf start_wqthread + 15

RCTFatal
facebook::react::invokeInner(RCTBridge*, RCTModuleData*, unsigned int, folly::dynamic const&, int, (anonymous namespace)::SchedulingContext)
facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)::$_0::operator()() const
invocation function for block in facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)

This is my Swift file; the error happens in the record function.

import Foundation
import ShazamKit

@objc(ShazamIOS)
class ShazamIOS: NSObject {
    @Published var matching: Bool = false
    @Published var mediaItem: SHMatchedMediaItem?
    @Published var error: Error? {
        didSet { hasError = error != nil }
    }
    @Published var hasError: Bool = false

    private lazy var audioSession: AVAudioSession = .sharedInstance()
    private lazy var session: SHSession = .init()
    private lazy var audioEngine: AVAudioEngine = .init()
    private lazy var inputNode = self.audioEngine.inputNode
    private lazy var bus: AVAudioNodeBus = 0

    override init() {
        super.init()
        session.delegate = self
    }

    @objc func exposed(_ resolve: RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
        start()
        resolve("ios code executed")
    }

    func start() {
        switch audioSession.recordPermission {
        case .granted:
            self.record()
        case .denied:
            DispatchQueue.main.async { self.error = ShazamError.recordDenied }
        case .undetermined:
            audioSession.requestRecordPermission { granted in
                DispatchQueue.main.async {
                    if granted {
                        self.record()
                    } else {
                        self.error = ShazamError.recordDenied
                    }
                }
            }
        @unknown default:
            DispatchQueue.main.async { self.error = ShazamError.unknown }
        }
    }

    private func record() {
        do {
            self.matching = true
            let format = self.inputNode.outputFormat(forBus: bus)
            self.inputNode.installTap(onBus: bus, bufferSize: 8192, format: format) { [weak self] (buffer, time) in
                self?.session.matchStreamingBuffer(buffer, at: time)
            }
            self.audioEngine.prepare()
            try self.audioEngine.start()
        } catch {
            self.error = error
        }
    }

    func stop() {
        self.audioEngine.stop()
        self.inputNode.removeTap(onBus: bus)
        self.matching = false
    }

    @objc static func requiresMainQueueSetup() -> Bool {
        return true
    }
}

extension ShazamIOS: SHSessionDelegate {
    func session(_ session: SHSession, didFind match: SHMatch) {
        DispatchQueue.main.async { [self] in
            if let mediaItem = match.mediaItems.first {
                self.mediaItem = mediaItem
                self.stop()
            }
        }
    }

    func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
        DispatchQueue.main.async { [self] in
            self.error = error
            self.stop()
        }
    }
}

The Objective-C file:

#import <Foundation/Foundation.h>
#import "React/RCTBridgeModule.h"

@interface RCT_EXTERN_MODULE(ShazamIOS, NSObject);

RCT_EXTERN_METHOD(exposed:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)

@end

How I consume the exposed function in React Native:

const {ShazamModule, ShazamIOS} = NativeModules;

const onPressIOSButton = () => {
  ShazamIOS.exposed()
    .then(result => console.log(result))
    .catch(e => console.log(e.message, e.code));
};
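A minimal sketch of a more defensive record(), under the assumption that the crash comes from the input node reporting a format with a 0 Hz sample rate or 0 channels because the AVAudioSession is not configured for recording when the tap is installed. It reuses the poster's audioSession, inputNode, bus, session and ShazamError names, so it is an illustration rather than a confirmed fix.

import AVFAudio

private func record() {
    do {
        // Configure and activate the session before asking the input node for its format;
        // without this the hardware format can come back as 0 Hz / 0 channels.
        try audioSession.setCategory(.playAndRecord, mode: .default)
        try audioSession.setActive(true, options: .notifyOthersOnDeactivation)

        let format = inputNode.outputFormat(forBus: bus)
        // Bail out instead of letting installTap(onBus:bufferSize:format:block:) throw
        // the IsFormatSampleRateAndChannelCountValid exception.
        guard format.sampleRate > 0, format.channelCount > 0 else {
            self.error = ShazamError.unknown // placeholder: reuses the poster's error type
            return
        }

        self.matching = true
        inputNode.installTap(onBus: bus, bufferSize: 8192, format: format) { [weak self] buffer, time in
            self?.session.matchStreamingBuffer(buffer, at: time)
        }
        audioEngine.prepare()
        try audioEngine.start()
    } catch {
        self.error = error
    }
}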
Posted Last updated
.
Post not yet marked as solved
0 Replies
206 Views
Hi Apple developers, is it possible to display lyrics synchronized with a song in my app after the song has been identified, just like the feature in the main Shazam app when you tap the middle-top icon (the one with a music note) after Shazam'ing a song? The app I'm developing is intended for the Deaf and hard-of-hearing, so I would love to be able to show the song lyrics to make the app accessible. I would greatly appreciate your help because I can't find this in the documentation. Many thanks in advance!
Posted Last updated
.
Post not yet marked as solved
0 Replies
293 Views
Hello, I have a song on Apple Music. When this song is found on Shazam, I want it to appear with a video clip like in the link I provided below. Is there any way you can help with this? Example: https://www.youtube.com/watch?v=St8smx2q1Ho My Music: https://music.apple.com/us/album/tam-ba%C4%9F%C4%B1ms%C4%B1z-t%C3%BCrkiye/1689395789?i=1689395790 Thanks.
Posted
by yasirb_.
Last updated
.
Post marked as solved
1 Reply
397 Views
I am trying to extract the audio preview URL from ShazamKit; it is deep inside the hierarchy SHMediaItem > songs > previewAssets > url. When I access the URL like this:

let url = firstItem.songs[0].previewAssets?[0].url

I am getting a warning (a screenshot of the Variable Viewer was attached). This is what I have done so far:

struct MediaItems: Codable {
    let title: String?
    let subtitle: String?
    let shazamId: String?
    let appleMusicId: String?
    let appleMusicUrL: URL?
    let artworkUrl: URL?
    let artist: String?
    let matchOffset: TimeInterval?
    let videoUrl: URL?
    let webUrl: URL?
    let genres: [String]
    let isrc: String?
    let songs: [Song]?
}

extension SwiftFlutterShazamKitPlugin: SHSessionDelegate {
    public func session(_ session: SHSession, didFind match: SHMatch) {
        let mediaItems = match.mediaItems
        if let firstItem = mediaItems.first {
            // extracting the url
            let url = firstItem.songs[0].previewAssets?[0].url
            let _shazamMedia = MediaItems(
                title: firstItem.title!,
                subtitle: firstItem.subtitle!,
                shazamId: firstItem.shazamID!,
                appleMusicId: firstItem.appleMusicID!,
                appleMusicUrL: firstItem.appleMusicURL!,
                artworkUrl: firstItem.artworkURL!,
                artist: firstItem.artist!,
                matchOffset: firstItem.matchOffset,
                videoUrl: firstItem.videoURL!,
                webUrl: firstItem.webURL!,
                genres: firstItem.genres,
                isrc: firstItem.isrc!,
                songs: firstItem.songs
            )
            do {
                let jsonData = try JSONEncoder().encode([_shazamMedia])
                let jsonString = String(data: jsonData, encoding: .utf8)!
                self.callbackChannel?.invokeMethod("matchFound", arguments: jsonString)
            } catch {
                callbackChannel?.invokeMethod("didHasError", arguments: "Error when trying to format data, please try again")
            }
        }
    }
}
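For what it's worth, a small sketch that avoids the force unwraps and subscripting, assuming SHMediaItem.songs bridges to MusicKit's Song (where previewAssets and url are optional and can legitimately be nil for some catalog entries). The helper name is illustrative, not ShazamKit API.

import ShazamKit
import MusicKit

// Hypothetical helper: chain the optionals instead of indexing with [0],
// so a missing preview asset produces nil rather than a crash.
func previewURL(from item: SHMediaItem) -> URL? {
    item.songs.first?.previewAssets?.first?.url
}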
Posted Last updated
.
Post not yet marked as solved
0 Replies
402 Views
We're looking to integrate ShazamKit, but can't find any details of the associated costs. Are there fees or rate limits for matching? And is attribution to the matched song on Apple Music required? Thank you.
Posted
by _Jay.
Last updated
.
Post not yet marked as solved
1 Reply
1.2k Views
We are using the ShazamKit SDK for Android, and our application sometimes crashes when performing audio recognition. We get the following logs:

Cause: null pointer dereference
backtrace:
  #00 pc 000000000000806c /data/app/lib/arm64/libsigx.so (SHAZAM_SIGX::reset()) (BuildId: 40e0b3c4250b21f23f7c4ec7d7b88f954606d914)
  #01 pc 00000000000dc324 /data/app//oat/arm64/base.odex
  at libsigx.SHAZAM_SIGX::reset()(reset:0)
  at base.0xdc324(Native Method)
Posted
by guerwan.
Last updated
.
Post not yet marked as solved
0 Replies
446 Views
ShazamKit's SHManagedSession() doesn't work on macOS 14 RC 23A339. Error output:

AddInstanceForFactory: No factory registered for id <CFUUID 0x600000540340> F8BB1C28-BAE8-11D6-9C31-00039315CD46
HALC_ShellDevice.cpp:2,609 HALC_ShellDevice::RebuildControlList: couldn't find the control object
Prepare call ignored, the caller does not have record permission
Error: The operation couldn't be completed. (com.apple.ShazamKit error 202.)
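Given the "caller does not have record permission" line, here is a sketch of what typically needs to happen on macOS before SHManagedSession can record, assuming the target has an NSMicrophoneUsageDescription string and, if sandboxed, the Audio Input capability. The function name is illustrative.

import AVFoundation
import ShazamKit

func recognizeOnce() async {
    // Ask for microphone access first; prepare() is ignored without it.
    guard await AVCaptureDevice.requestAccess(for: .audio) else { return }

    let managedSession = SHManagedSession()
    await managedSession.prepare()

    switch await managedSession.result() {
    case .match(let match):
        print("Matched:", match.mediaItems.first?.title ?? "unknown title")
    case .noMatch:
        print("No match")
    case .error(let error, _):
        print("Error:", error.localizedDescription)
    }
}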
Posted
by Ruizhe.
Last updated
.
Post not yet marked as solved
1 Reply
523 Views
I want to use SHLibrary.default.items to show the music I recognized with Shazam, but SHLibrary.default.items always returns an empty list. I ran an experiment: I called SHLibrary.default.items as soon as I entered a page and it returned an empty list, but after using SHManagedSession to identify songs and then calling SHLibrary.default.items, it returned the result I wanted. Below is the test code:

private func bindEvent() {
    // called when the view is created; items returns an empty list
    if #available(iOS 17, *) {
        let items = SHLibrary.default.items
        print("-------->>>>>>>>\(items)")
    }
    self.addToMediaLibray.onTap { [weak self] in
        guard let `self` = self,
              let result = self.result,
              let appleMusicID = result.appleMusicID else {
            return
        }
        if #available(iOS 17, *) {
            // called after music was recognized; items is not empty
            let items = SHLibrary.default.items
            print("1111-------->>>>>>>>\(items)")
        }
    }
}

The attached file is part of the result log. My iOS version is iOS 17 (21A5326a) and my Xcode version is 15.0 beta 8 (15A5229m).
Posted
by tbfungeek.
Last updated
.
Post not yet marked as solved
2 Replies
1k Views
In the video, there is a demonstration of the Shazam tool. Where do I download it from? What is the process of getting it on macOS?
Posted
by greenpau.
Last updated
.
Post not yet marked as solved
3 Replies
786 Views
https://developer.apple.com/documentation/shazamkit/shazamkit_dance_finder_with_managed_session
The song detection is successful, but with the new APIs I can't get this demo working with SHLibrary; it is expected to display the RecentDanceRowView. I wonder if I missed any steps or if SHLibrary is not ready yet.
Posted
by Ruizhe.
Last updated
.
Post not yet marked as solved
0 Replies
451 Views
Looking to fill in a budget line for an iOS application we are trying to build. If we add ShazamKit to our app, how much does it cost per stream? My research indicates that it is $0.00065 per stream; is that correct?
Posted
by utrema.
Last updated
.
Post not yet marked as solved
0 Replies
437 Views
I'm trying to get ShazamKit for Android to work. I have a catalog that I download from an external service, cache in internal app storage and read back from internal app storage. Doing so, I get no matches. However, if I manually copy the file from internal app storage to my computer, put it in the assets folder, and read it from there, I do get matches. So the issue must be in how the file is read. See the comments in the code below. Here's the code:

private const val BUFFER_SIZE = 3840

class ShazamService(private val app: Application) {

    private val coroutineScope = CoroutineScope(Dispatchers.IO + Job())
    private val repository = ShazamRepository(...)
    private val catalog = ShazamKit.createCustomCatalog()
    private val recorder by lazy { AudioRecording(app) }
    private var session: StreamingSession? = null

    suspend fun initialize(source: Source) {
        // This method does not work
        addCatalog(source)
        // This works when used
        // loadCustomCatalog()

        session = (ShazamKit.createStreamingSession(
            catalog,
            AudioSampleRateInHz.SAMPLE_RATE_48000,
            BUFFER_SIZE
        ) as ShazamKitResult.Success).data

        session?.recognitionResults()?.onEach { matchResult ->
            onMatch(matchResult)
        }?.flowOn(Dispatchers.Main)?.launchIn(coroutineScope)
    }

    // This works
    private suspend fun loadCustomCatalog() {
        val assetManager = app.assets
        val inputStream: InputStream = assetManager.open("catalog.shazamcatalog")
        catalog.addFromCatalog(inputStream)
    }

    // This does not work
    private suspend fun addCatalog(source: Source) {
        repository.loadFile(app.applicationContext, source)?.use { data ->
            val result = this.catalog.addFromCatalog(data)
            Timber.d("Catalog added: $result")
        }
    }

    fun start() {
        recorder.startRecording { data ->
            session?.matchStream(data, data.size, 0)
        }
    }

    fun stop() {
        recorder.stopRecording()
    }

    private fun onMatch(result: com.shazam.shazamkit.MatchResult) {
        Timber.d("Received MatchResult: $result")
    }

    fun destroy() {
        coroutineScope.cancel()
    }
}

Here's the repository responsible for providing the catalog file. Source contains an id and a url from which a catalog can be downloaded. The repository downloads the catalog, saves it as a file in internal app storage and returns a FileInputStream:

class ShazamRepository(
    private val shazamClient: ShazamClient
) {
    suspend fun loadFile(context: Context, source: Source): FileInputStream? {
        val file = File(context.filesDir, source.id + ".shazamcatalog")
        val catalog = loadFile(file)
        if (catalog == null) {
            val response = shazamClient.getCatalog(source.url)
            if (response.isSuccessful) {
                response.body()?.let { data ->
                    saveResponseData(data, file)
                }
            }
        } else {
            return catalog
        }
        return loadFile(file)
    }

    private fun saveResponseData(data: ResponseBody, file: File) {
        data.byteStream().use { inputStream ->
            FileOutputStream(file).use { outputStream ->
                val buffer = ByteArray(4 * 1024)
                var read: Int
                while (inputStream.read(buffer).also { read = it } != -1) {
                    outputStream.write(buffer, 0, read)
                }
                outputStream.flush()
            }
            inputStream.close()
        }
    }

    private fun loadFile(file: File): FileInputStream? {
        return if (file.exists()) {
            FileInputStream(file)
        } else {
            return null
        }
    }
}

To summarise: the catalog is downloaded and saved correctly. If the catalog file is opened with assetManager.open, I get matches. When using FileInputStream(file), no matches are received. What could be wrong with the File API approach? Why does it work when using the AssetManager but not when reading it as a File?
Posted
by adpal.
Last updated
.
Post not yet marked as solved
0 Replies
429 Views
ShazamKit is integrated into my music app, but some users have reported issues with the music recognition function not working for them. I am wondering if ShazamKit's functionality could be limited in certain countries or regions. Any insights on this matter would be greatly appreciated.
Posted
by Denis_M.
Last updated
.
Post not yet marked as solved
1 Reply
567 Views
I am trying to match microphone audio against a custom catalog that I created via ShazamKit. What code do I need to extract and display the matchedMediaItem.artist information on my iPhone screen after finding a match against an entry in my custom-built catalog? I am a beginner.
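Since this is a beginner question, here is a rough sketch of one way to surface the artist in SwiftUI, assuming an SHSession that has already been configured with the custom catalog and microphone streaming. MatchViewModel and ArtistView are illustrative names, not ShazamKit API.

import ShazamKit
import SwiftUI

final class MatchViewModel: NSObject, ObservableObject, SHSessionDelegate {
    @Published var artist: String = "Listening…"

    func session(_ session: SHSession, didFind match: SHMatch) {
        DispatchQueue.main.async {
            // SHMatchedMediaItem inherits artist from SHMediaItem; it may be nil
            // for custom catalog entries created without that property.
            self.artist = match.mediaItems.first?.artist ?? "Unknown artist"
        }
    }
}

struct ArtistView: View {
    @ObservedObject var model: MatchViewModel

    var body: some View {
        Text(model.artist)
            .font(.title2)
    }
}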
Posted Last updated
.
Post not yet marked as solved
2 Replies
602 Views
Hi, I am using ShazamKit to detect songs from a live stream. I am using matchStreamingBuffer with a PCM buffer. It works for the most part, but sometimes it throws an NSException. Here's the code calling the match:

engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: options.audioFormat) { buffer, time in
    do {
        self.session.matchStreamingBuffer(buffer, at: time)
    } catch {
    }
}

The exception:

Supplied audio format is not supported <CMAudioFormatDescription 0x2828a29e0 [0x20f7863a0]> {
    mediaType:'soun'
    mediaSubType:'lpcm'
    mediaSpecific: {
        ASBD: {
            mSampleRate: 44100.000000
            mFormatID: 'lpcm'
            mFormatFlags: 0x29
            mBytesPerPacket: 4
            mFramesPerPacket: 1
            mBytesPerFrame: 4
            mChannelsPerFrame: 2
            mBitsPerChannel: 32
        }
        cookie: {(null)}
        ACL: {Stereo (L R)}
        FormatList Array: {
            Index: 0
            ChannelLayoutTag: 0x650002
            ASBD: {
                mSampleRate: 44100.000000
                mFormatID: 'lpcm'
                mFormatFlags: 0x29
                mBytesPerPacket: 4
                mFramesPerPacket: 1
                mBytesPerFrame: 4
                mChannelsPerFrame: 2
                mBitsPerChannel: 32
            }
        }
    }
    extensions: {(null)}
}

This is the stack:

0  CoreFoundation   0xa248   __exceptionPreprocess
1  libobjc.A.dylib  0x17a68  objc_exception_throw
2  ShazamKit        0x159d0  -[SHMutableSignature appendBuffer:atTime:error:]
3  ShazamKit        0x6d7c   -[SHSignatureGenerator appendBuffer:atTime:error:]
4  ShazamKit        0x3968   -[SHSessionDriverSignatureSlot appendBuffer:atTime:error:]
5  ShazamKit        0x10430  -[SHSignatureBuffer flow:time:]
6  ShazamKit        0x2490   -[SHStreamingSessionDriver flow:time:]
7  ShazamKit        0xf784   -[SHSession matchStreamingBuffer:atTime:]
8  MyApp            0x17f69c thunk for @escaping @callee_guaranteed (@guaranteed AVAudioPCMBuffer, @guaranteed AVAudioTime) -> () (<compiler-generated>)
9  AVFAudio         0x482ac  AVAudioNodeTap::TapMessage::RealtimeMessenger_Perform()
10 AVFAudio         0x71c4   CADeprecated::RealtimeMessenger::_PerformPendingMessages()
11 AVFAudio         0x471e4  invocation function for block in CADeprecated::RealtimeMessenger::RealtimeMessenger(applesauce::dispatch::v1::queue)

I don't mind failing if the format is not supported, but how can I avoid crashing?
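One pattern that may avoid the exception, under the assumption that the non-interleaved stereo 44.1 kHz float format from the mixer tap is what ShazamKit rejects: convert each tapped buffer with AVAudioConverter to a mono Float32 format before calling matchStreamingBuffer, so the session never sees an unsupported format. The class name and the 48 kHz mono target are illustrative choices, not ShazamKit requirements confirmed by the post.

import AVFAudio
import ShazamKit

final class ConvertingMatcher {
    let session = SHSession()

    // Target format the buffers are converted to before matching; mono 48 kHz Float32 is an assumption.
    private let outputFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                             sampleRate: 48_000,
                                             channels: 1,
                                             interleaved: false)!
    private var converter: AVAudioConverter?

    func match(_ buffer: AVAudioPCMBuffer, at time: AVAudioTime?) {
        if converter == nil {
            converter = AVAudioConverter(from: buffer.format, to: outputFormat)
        }
        guard let converter else { return }

        // Size the destination buffer for the sample-rate ratio.
        let ratio = outputFormat.sampleRate / buffer.format.sampleRate
        let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio) + 1
        guard let converted = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: capacity) else { return }

        // Feed the source buffer exactly once, then report that no more data is available.
        var consumed = false
        var conversionError: NSError?
        converter.convert(to: converted, error: &conversionError) { _, outStatus in
            if consumed {
                outStatus.pointee = .noDataNow
                return nil
            }
            consumed = true
            outStatus.pointee = .haveData
            return buffer
        }

        if conversionError == nil {
            session.matchStreamingBuffer(converted, at: time)
        }
    }
}

Calling match(_:at:) from the tap block in place of matchStreamingBuffer keeps the rest of the setup unchanged; checking buffer.format before converting, or simply skipping unsupported buffers, would be an equally reasonable fallback.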
Posted
by wotson.
Last updated
.