The first u32 describes the vibration device type, which in this case is the Linear Resonant Actuator (LRA) used in Nintendo Switch controller hardware.
The second u32 describes the vibration device position, in this case distinguishing between left and right vibration actuators.
Pro Controllers have 2 LRAs each that can vibrate independently of each other, so each controller exposes 2 distinct vibration device handles to distinguish between the two actuators.
Similarly for Joy-Cons: since each Joy-Con has 1 LRA, the left Joy-Con can be distinguished from the right one through the vibration device handle.
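For illustration, a handle of this shape could be modeled along the following lines (the enum, field names, and values here are assumptions for the sketch, not the exact definitions in the codebase):

```cpp
#include <cstdint>

// Illustrative sketch only; actual names and values may differ.
enum class VibrationDeviceType : std::uint32_t {
    LinearResonantActuator = 1,
};

enum class VibrationDevicePosition : std::uint32_t {
    None = 0,
    Left = 1,   // e.g. left Joy-Con, or the left LRA of a Pro Controller
    Right = 2,  // e.g. right Joy-Con, or the right LRA of a Pro Controller
};

// The two u32s described above, packed into one handle.
struct VibrationDeviceInfo {
    VibrationDeviceType type;
    VibrationDevicePosition position;
};
```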
Resolves numerous deprecation warnings throughout the codebase caused by
the inclusion of this header. Building core should now be significantly less
noisy (and also rely on less global state).
This also uncovered quite a few modules that were relying on indirect
includes; those have been fixed as well.
The interrupt handler contains a std::atomic_bool, which isn't copyable
or movable, so the explicitly defaulted special move member functions are
always implicitly deleted.
This can resolve warnings on clang and GCC.
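A minimal reproduction of the warning being silenced (type and member names here are illustrative):

```cpp
#include <atomic>

struct InterruptHandler {
    InterruptHandler() = default;

    // std::atomic_bool is neither copyable nor movable, so these defaulted
    // move operations end up implicitly deleted; clang (and GCC) can warn
    // that the "= default" has no effect. Dropping the declarations
    // avoids the warning.
    InterruptHandler(InterruptHandler&&) = default;
    InterruptHandler& operator=(InterruptHandler&&) = default;

    std::atomic_bool is_interrupted{false};
};
```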
Some games like Cave Story+ set invalid values in the ControllerPrivateArg's mode and caller fields.
Use other fields to determine the appropriate mode and caller should either or both fields be invalid.
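A hedged sketch of that kind of fallback (the struct layout, enum values, and helper below are assumptions rather than the actual applet code):

```cpp
#include <cstdint>

// Illustrative layout only; the real ControllerPrivateArg differs.
enum class ControllerAppletMode : std::uint32_t {
    ShowControllerSupport = 0,
    ShowControllerStrapGuide = 1,
};

struct ControllerPrivateArg {
    std::uint32_t arg_size; // size of the argument payload actually passed
    ControllerAppletMode mode;
    std::uint32_t caller;
};

// If the game wrote an out-of-range value into `mode`, fall back to a
// known-good field (here, the payload size) to pick the intended mode.
ControllerAppletMode DetermineMode(const ControllerPrivateArg& arg) {
    if (arg.mode == ControllerAppletMode::ShowControllerSupport ||
        arg.mode == ControllerAppletMode::ShowControllerStrapGuide) {
        return arg.mode;
    }
    return arg.arg_size >= sizeof(ControllerPrivateArg)
               ? ControllerAppletMode::ShowControllerSupport
               : ControllerAppletMode::ShowControllerStrapGuide;
}
```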
- This works similarly to GetAlbumContentsFileListForApplication.
- Since we do not implement the album, this should be safe to stub for now.
- Used by Super Smash Bros. Ultimate (newer updates) in World of Light.
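As a rough idea, a stub of this shape typically ends up looking something like the following (the command name, interface, and response layout are placeholders for illustration, not the actual ones added):

```cpp
// Placeholder names throughout; shown only to illustrate the stubbing pattern.
void IAlbumAccessorSession::GetAlbumFileList(Kernel::HLERequestContext& ctx) {
    LOG_WARNING(Service_Capture, "(STUBBED) called");

    IPC::ResponseBuilder rb{ctx, 3};
    rb.Push(RESULT_SUCCESS);
    rb.Push<u32>(0); // report zero album entries, since the album is not implemented
}
```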
Unicorn has long since lost most of its use, as dynarmic has gained support
for handling most instructions. At this point, any further issues
encountered should be used to make dynarmic better.
This also allows us to remove our dependency on Python.
This commit aims to implement the NVDEC (Nvidia Decoder) functionality, with video frame decoding being handled by the FFmpeg library.
The process begins with Ioctl commands being sent to the NVDEC and VIC (Video Image Composer) emulated devices. These allocate the necessary GPU buffers for the frame data, along with providing information on the incoming video data. A Submit command then signals the GPU to process and decode the frame data.
To decode the frame, the respective codec's header must be manually composed from the information provided by NVDEC, then sent with the raw frame data to the FFmpeg library.
Currently, H264 and VP9 are supported, with VP9 having some minor artifacting issues related mainly to the reference frame composition in its uncompressed header.
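In rough terms, the FFmpeg side of the decode path looks like the sketch below (a simplified illustration that omits the NVDEC header composition and GPU buffer management; the helper function names are placeholders):

```cpp
extern "C" {
#include <libavcodec/avcodec.h>
}
#include <cstdint>

// Simplified sketch of the FFmpeg decode step described above.
// Error handling is reduced for brevity.
AVCodecContext* CreateDecoder(AVCodecID codec_id /* AV_CODEC_ID_H264 or AV_CODEC_ID_VP9 */) {
    const AVCodec* codec = avcodec_find_decoder(codec_id);
    AVCodecContext* context = avcodec_alloc_context3(codec);
    if (avcodec_open2(context, codec, nullptr) < 0) {
        avcodec_free_context(&context);
        return nullptr;
    }
    return context;
}

AVFrame* DecodeFrame(AVCodecContext* context, const std::uint8_t* data, int size) {
    AVPacket* packet = av_packet_alloc();
    packet->data = const_cast<std::uint8_t*>(data); // composed header + raw frame data
    packet->size = size;

    avcodec_send_packet(context, packet);            // feed the composed bitstream
    AVFrame* frame = av_frame_alloc();
    if (avcodec_receive_frame(context, frame) < 0) { // pull the decoded frame
        av_frame_free(&frame);
        frame = nullptr;
    }

    av_packet_free(&packet);
    return frame;
}
```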
Async GPU is not properly implemented at the moment.
Co-Authored-By: David <25727384+ogniK5377@users.noreply.github.com>