- Aug 26, 2024
-
-
Signed-off-by: Claudio Cambra <developer@claudiocambra.com>
-
Signed-off-by: Claudio Cambra <developer@claudiocambra.com>
-
Signed-off-by: Claudio Cambra <developer@claudiocambra.com>
-
GetLatency calls `[AVAudioSession outputLatency]` in the following callstack:

- _malloc_zone_malloc_instrumented_or_legacy
- _Block_copy
- -[_NSXPCDistantObject _initWithConnection:proxyNumber:generationCount:interface:options:error:]
- -[NSXPCConnection synchronousRemoteObjectProxyWithErrorHandler:]
- caulk::xpc::message<id<SessionManagerXPCProtocol> __strong, objc_object* __strong, unsigned int>::sync_proxy()
- GetPropertyXPC(std::__1::shared_ptr<as::client::XPCConnection>, unsigned int, NSString*, bool)
- GetProperty(AVAudioSessionImpl*, NSString*, bool)
- float GetProperty_DefaultToZeroXPC<float>(AVAudioSessionImpl*, NSString*, bool)
- -[AVAudioSession outputLatency]
- GetLatency
- GetLatency
- ca_Render
- RenderCallback
- ausdk::AUInputElement::PullInput(unsigned int&, AudioTimeStamp const&, unsigned int, unsigned int)
- AUInputFormatConverter2::InputProc(OpaqueAudioConverter*, unsigned int*, AudioBufferList*, AudioStreamPacketDescription**, void*)
- caulk::expected<unsigned int, int> caulk::function_ref<caulk::expected<unsigned int, int> (ACAudioSpan&)>::functor_invoker<acv2::AudioConverterV2::fillComplexBuffer(int (*)(OpaqueAudioConverter*, unsigned int*, AudioBufferList*, AudioStreamPacketDescription**, void*), void*, unsigned int*, AudioBufferList*, AudioStreamPacketDescription*, AudioStreamPacketDependencyInfo*)::$_2>(caulk::details::erased_callable<caulk::expected<unsigned int, int> (ACAudioSpan&)> const&, ACAudioSpan&)
- acv2::AudioConverterChain::ObtainInput(acv2::AudioConverterBase&, unsigned int)
- acv2::CBRConverter::ProduceOutput(ACAudioSpan&)
- acv2::AudioConverterChain::ProduceOutput(caulk::function_ref<caulk::expected<unsigned int, int> (ACAudioSpan&)>, ACBaseAudioSpan&)
- acv2::AudioConverterV2::fillComplexBuffer(int (*)(OpaqueAudioConverter*, unsigned int*, AudioBufferList*, AudioStreamPacketDescription**, void*), void*, unsigned int*, AudioBufferList*, AudioStreamPacketDescription*, AudioStreamPacketDependencyInfo*)
- with_resolved(OpaqueAudioConverter*, caulk::function_ref<int (AudioConverterAPI*)>)
- AudioConverterFillComplexBuffer
- AUConverterBase::RenderBus(unsigned int&, AudioTimeStamp const&, unsigned int, unsigned int)
- AURemoteIO::RenderBus(unsigned int&, AudioTimeStamp const&, unsigned int, unsigned int)
- ausdk::AUBase::DoRender(unsigned int&, AudioTimeStamp const&, unsigned int, unsigned int, AudioBufferList&)
- AURemoteIO::PerformIO(unsigned int, unsigned int, unsigned int, AudioTimeStamp const&, AudioTimeStamp const&, AudioBufferList const*, AudioBufferList*, int&)
- _XPerformIO
- mshMIGPerform
- MSHMIGDispatchMessage
- void* caulk::thread_proxy<std::__1::tuple<caulk::thread::attributes, AURemoteIO::IOThread::IOThread(AURemoteIO&, caulk::thread::attributes const&, caulk::mach::os_workgroup_managed const&)::'lambda'(), std::__1::tuple<>>>(void*)
- _pthread_start
- thread_start

A call to an Objective-C method might generate objects released from an autoreleasepool, which seems to be the case with the call to `[NSXPCConnection synchronousRemoteObjectProxyWithErrorHandler:]`[^1]. I have not found where exactly `[[block copy] autorelease]` is called, but setting up the autoreleasepool before unwinding back to the C code removes the accumulation of memory on this specific callstack.

[^1]: https://developer.apple.com/documentation/foundation/nsxpcconnection/2879410-synchronousremoteobjectproxywith?language=objc
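As a minimal sketch of the idea (the names and signatures below are illustrative, not the actual code), the audio render callback can push an autorelease pool before doing any work and pop it before returning, so anything autoreleased by the Objective-C calls further down the stack is drained every render cycle. `objc_autoreleasePoolPush()`/`objc_autoreleasePoolPop()` are the libobjc runtime calls that an `@autoreleasepool` block compiles down to, which keeps the file compilable as plain C/C++:

```cpp
#include <AudioToolbox/AudioToolbox.h>

// libobjc entry points behind @autoreleasepool; declared directly so this
// sketch does not need to be compiled as Objective-C.
extern "C" void *objc_autoreleasePoolPush(void);
extern "C" void  objc_autoreleasePoolPop(void *pool);

// Hypothetical stand-in for the existing render path (ca_Render in the
// callstack above).
static OSStatus DoRender(void *, AudioUnitRenderActionFlags *,
                         const AudioTimeStamp *, UInt32, UInt32,
                         AudioBufferList *)
{
    return noErr;
}

// Hypothetical render callback: the pool is set up before entering the C
// render code and drained on the way out, so autoreleased XPC objects do not
// accumulate on the I/O thread.
static OSStatus RenderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber, UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    void *pool = objc_autoreleasePoolPush();
    OSStatus ret = DoRender(inRefCon, ioActionFlags, inTimeStamp,
                            inBusNumber, inNumberFrames, ioData);
    objc_autoreleasePoolPop(pool);
    return ret;
}
```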
-
- Aug 25, 2024
-
-
Clippy reports[^1] these transmutes as unnecessary, and it seems that this now works correctly with `as *mut c_void`, without the additional template type parameters.

[^1]: https://rust-lang.github.io/rust-clippy/master/index.html#/transmutes_expressible_as_ptr_casts
-
The de-duplication checked whether `$plugin` was already in the `PLUGINS` array, but the match was not an exact one, so plugins like `ts_plugin` would match `mux_ts_plugin` and would not be copied to the final application.
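To make the failure mode concrete, here is a small model of the check (the real de-duplication lives in a packaging shell script, so this C++ sketch is purely illustrative): a substring test considers `ts_plugin` already present as soon as `mux_ts_plugin` is in the list, while a whole-name comparison does not.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Buggy check: treats a plugin as present if its name appears anywhere inside
// an already-listed name, so "ts_plugin" matches "mux_ts_plugin".
static bool contains_substring(const std::vector<std::string> &plugins,
                               const std::string &name)
{
    for (const std::string &p : plugins)
        if (p.find(name) != std::string::npos)
            return true;
    return false;
}

// Fixed check: only a whole-name match counts as a duplicate.
static bool contains_exact(const std::vector<std::string> &plugins,
                           const std::string &name)
{
    return std::find(plugins.begin(), plugins.end(), name) != plugins.end();
}

int main()
{
    std::vector<std::string> plugins = { "mux_ts_plugin" };
    assert(contains_substring(plugins, "ts_plugin"));  // wrongly "already there"
    assert(!contains_exact(plugins, "ts_plugin"));     // correctly still missing
    return 0;
}
```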
-
- Aug 24, 2024
-
-
Steve Lhomme authored
-
Steve Lhomme authored
It works well with WDDM 2.3 and above. Otherwise we fall back to the old way.
-
- Aug 23, 2024
-
-
On some configurations, such as Windows 11 PE reportedly, `d3d9.dll` might not be present.
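A sketch of how a module can cope with that, under the assumption that `d3d9.dll` is loaded dynamically rather than linked at build time (names here are illustrative, not the actual code): probe for the DLL at runtime and bail out cleanly when it is absent.

```cpp
#include <windows.h>
#include <d3d9.h>

typedef IDirect3D9 *(WINAPI *Direct3DCreate9_t)(UINT);

// Returns nullptr when d3d9.dll is missing (e.g. on a stripped-down Windows
// image) instead of failing to load the whole module through an import.
static IDirect3D9 *TryCreateD3D9(void)
{
    HMODULE hd3d9 = LoadLibraryW(L"d3d9.dll");
    if (hd3d9 == nullptr)
        return nullptr; // d3d9.dll not present on this system

    Direct3DCreate9_t create =
        (Direct3DCreate9_t)GetProcAddress(hd3d9, "Direct3DCreate9");
    return create != nullptr ? create(D3D_SDK_VERSION) : nullptr;
}
```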
-
`CreateDXGIFactory2()` is not available on Windows 7, and it seems possible to create `IDXGIFactory2` using `CreateDXGIFactory1()`.
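A hedged sketch of that fallback path (the actual code is C and structured differently): create an `IDXGIFactory1` with `CreateDXGIFactory1()`, then query it for `IDXGIFactory2`, which succeeds on any platform where the newer interface is implemented.

```cpp
#include <dxgi1_2.h>

// Obtain an IDXGIFactory2 without calling CreateDXGIFactory2(), which does
// not exist on Windows 7. Returns nullptr if the platform is too old.
static IDXGIFactory2 *CreateFactory2Compat(void)
{
    IDXGIFactory1 *factory1 = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void **)&factory1)))
        return nullptr;

    IDXGIFactory2 *factory2 = nullptr;
    HRESULT hr = factory1->QueryInterface(__uuidof(IDXGIFactory2),
                                          (void **)&factory2);
    factory1->Release();
    return SUCCEEDED(hr) ? factory2 : nullptr;
}
```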
-
refs #28747
-
Signed-off-by: Claudio Cambra <developer@claudiocambra.com>
-
Signed-off-by: Claudio Cambra <developer@claudiocambra.com>
-
- Aug 22, 2024
-
-
Steve Lhomme authored
-
Steve Lhomme authored
It's no longer about whether the shaders use Texture Arrays or not.
-
Steve Lhomme authored
Before D3D11.1 it's not possible. But we can use a VideoProcessor to transform the decoded format into RGB. This is equivalent to what D3D9 is doing behind our back.
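A rough sketch of what such a conversion looks like with the D3D11 video APIs (creation of the processor, the views and the colorspace setup are omitted, and the names are illustrative rather than the actual code): the decoded surface is fed as the input stream of a `VideoProcessorBlt()` whose output view targets an RGB texture.

```cpp
#include <d3d11.h>

// Convert one decoded frame into the RGB render target behind outView.
// inView wraps the decoder's output texture, outView wraps an RGB texture
// that the display pipeline can use directly.
static HRESULT ConvertToRGB(ID3D11VideoContext *vctx,
                            ID3D11VideoProcessor *processor,
                            ID3D11VideoProcessorInputView *inView,
                            ID3D11VideoProcessorOutputView *outView)
{
    D3D11_VIDEO_PROCESSOR_STREAM stream = {};
    stream.Enable = TRUE;
    stream.pInputSurface = inView;
    return vctx->VideoProcessorBlt(processor, outView,
                                   /*OutputFrame*/ 0,
                                   /*StreamCount*/ 1, &stream);
}
```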
-
Steve Lhomme authored
We haven't used texture arrays in the shader since ddf67a90.
-
Steve Lhomme authored
-
Steve Lhomme authored
These textures can't be loaded in the shaders. But they can be processed by a VideoProcessor to output some RGB format that we can display. This is equivalent to what D3D9 would do behind our back, but in D3D11.
-
Steve Lhomme authored
No need to pass the actual decoder format that may be used. Check the hardware format for VideoProcessor input support; otherwise we can't use a VideoProcessor.
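A simplified sketch of that support check (illustrative names, not the actual C code): create an `ID3D11VideoProcessorEnumerator` for the expected dimensions and ask whether the hardware format is accepted as a VideoProcessor input.

```cpp
#include <d3d11.h>

// Returns true if the given DXGI format (e.g. the hardware decoder format)
// can be used as a VideoProcessor input on this device.
static bool CanProcessFormat(ID3D11VideoDevice *vdev, DXGI_FORMAT fmt,
                             UINT width, UINT height)
{
    D3D11_VIDEO_PROCESSOR_CONTENT_DESC desc = {};
    desc.InputFrameFormat = D3D11_VIDEO_FRAME_FORMAT_PROGRESSIVE;
    desc.InputWidth   = width;
    desc.InputHeight  = height;
    desc.OutputWidth  = width;
    desc.OutputHeight = height;
    desc.Usage = D3D11_VIDEO_USAGE_PLAYBACK_NORMAL;

    ID3D11VideoProcessorEnumerator *enumerator = nullptr;
    if (FAILED(vdev->CreateVideoProcessorEnumerator(&desc, &enumerator)))
        return false;

    UINT flags = 0;
    HRESULT hr = enumerator->CheckVideoProcessorFormat(fmt, &flags);
    enumerator->Release();
    return SUCCEEDED(hr) &&
           (flags & D3D11_VIDEO_PROCESSOR_FORMAT_SUPPORT_INPUT) != 0;
}
```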
-
Steve Lhomme authored
-
Steve Lhomme authored
The VideoProcessor will make it properly opaque.
-
Steve Lhomme authored
We already check the proper internal format of the decoder.
-
Steve Lhomme authored
The decoder output format is decided by the decoder, not the display module anymore.
-
Steve Lhomme authored
If not, we should favor another display module, unless D3D11 is forced.
-
Steve Lhomme authored
Remove bogus AMD TextureArray handling. We haven't used texture arrays in the shader since ddf67a90.
-
Steve Lhomme authored
No functional changes.
-
Steve Lhomme authored
-