
Virtua Fighter 32X softlocks #237

Open · CometHunter92 opened this issue Nov 6, 2024 · 32 comments

Comments

@CometHunter92

With the latest commit, on the RG35XX (32-bit), Virtua Fighter 32X softlocks consistently. Basically every time I start it, it softlocks on the first stage. This didn't happen before. I don't have any other 32-bit ARM devices around right now; I'll test it on the Pi4 (64-bit) that I have soon.

@CometHunter92
Author

It happens on 64-bit ARM too; I just tested it on a Pi4. It softlocks consistently there as well, with frozen gameplay but music still playing.

@irixxxx
Collaborator

irixxxx commented Nov 8, 2024

Is this also happening with a standalone PicoDrive version?

@CometHunter92
Author

I would need to compile the binaries for those devices. Are you interested in the standalone behaviour on those platforms, or can I test with the Windows binaries too?

@CometHunter92
Author

CometHunter92 commented Nov 12, 2024

It doesn't softlock with the standalone win32 binaries. Tomorrow I'll try to compile a standalone version on the Pi4.

@irixxxx
Collaborator

irixxxx commented Nov 16, 2024

I'm tied up in November. I'll deal with it next month.

@CometHunter92
Author

CometHunter92 commented Nov 17, 2024

No worries at all, and thanks for all the work you put into this core!

@irixxxx
Collaborator

irixxxx commented Dec 6, 2024

I've tried to reproduce it under OSX and Linux on 64-bit ARM, but no luck. Have you changed any option settings for retroarch and/or picodrive?

@CometHunter92
Author

CometHunter92 commented Dec 7, 2024

PicoDrive settings are all stock, apart from the 6-button pad as device 1. RetroArch is nothing fancy, no run-ahead etc.
It still happens on all my platforms; I've just tried it on 4 different devices. It doesn't always happen, but it's more likely when you spam the "B" button (punch), even in the cutscenes. There it happens about 3 times out of 4. I managed to capture a log; it looks like something happens at a certain point and the ROM just gets unloaded (?). The ROM itself has a verified checksum.
retroarch__2024_12_07__01_42_24.log

[INFO] [Environ]: SET_GEOMETRY: 320x224, Aspect: 1.429.
[libretro INFO] 00002:072: 32X startup
[libretro INFO] 00002:072: drc_cmn_init: 0x7f9dea0000, 4194304 bytes: 0
[INFO] [Environ]: SET_GEOMETRY.
[INFO] [Environ]: SET_GEOMETRY.
[INFO] [Core]: Content ran for a total of: 00 hours, 01 minutes, 08 seconds.
[INFO] [Core]: No content, starting dummy core.
[INFO] [Core]: Content ran for a total of: 00 hours, 00 minutes, 00 seconds.
[INFO] [Core]: Unloading game..
[INFO] [Core]: Unloading core..
[INFO] [Core]: Unloading core symbols..
[INFO] [Core]: Saved core options file to "/run/muos/storage/info/config/PicoDrive/PicoDrive.opt".

@CometHunter92
Author

It happens with dynarec disabled too.

@irixxxx
Collaborator

irixxxx commented Dec 7, 2024

Does it also happen in the intro?

@CometHunter92
Author

Yes. After a few cycles of the intro it locked. I found that it "unlocks" if you save a state and load it back: when you load a state taken during the "lock", it unlocks itself.

@irixxxx
Collaborator

irixxxx commented Dec 9, 2024

I've had the intro running for hours, but absolutely no locking. Please send me your picodrive config file; I want to make sure we're operating under the same conditions.

@CometHunter92
Author

Have you tried closing the content and reopening it? The first time I tried the intro it didn't lock for 15 minutes on fast forward. It's totally random.
BTW, here are the PicoDrive opt file, the RetroArch cfg file and a save state of the "lock"; note that if you load it, it will unlock itself.
virtuapico.zip

@irixxxx
Collaborator

irixxxx commented Dec 10, 2024

Hmm, no, it apparently doesn't happen here :-/
Would you be able to bisect with git to point me to the commit which causes this?
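Roughly, the bisect workflow would be something like this (just a sketch; <last-known-good> stands for whatever commit or release you last built that still worked):

  git bisect start
  git bisect bad HEAD                  # the current build softlocks
  git bisect good <last-known-good>    # a commit where it did not softlock
  # for each commit git checks out: rebuild the core, e.g.
  make -f Makefile.libretro
  # test in RetroArch, then mark the result and repeat:
  git bisect good      # or: git bisect bad
  git bisect reset     # finish up once git reports the first bad commit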

@irixxxx
Collaborator

irixxxx commented Dec 10, 2024

Is this maybe related to the retroarch version? Which version(s) are you using on your platforms?

@irixxxx
Collaborator

irixxxx commented Dec 10, 2024

Based on the information found in your save state, would you please check if this patch helps?
y.txt

@CometHunter92
Author

I'm using different devices with different RetroArch versions. The patch didn't work; it locked after a few seconds of the first match. BTW, I compiled a version with libretro VFS support disabled (it's enabled by default), and it has now been running for half an hour without locking. The fact that the log said the content was being unloaded, and that loading a save state of the lock brought the content back, made me think of that. Maybe it's just luck, I don't know.

@irixxxx
Collaborator

irixxxx commented Dec 10, 2024

From what I see in your save state I'd say that isn't very probable:

  • PicoDrive uses polling detection to check whether one or more of the CPUs are just reading the same memory location over and over again.
  • If polling is detected, that CPU is "turned off" to save emulation time.
  • I can see in your save state that the 68K CPU is at a location where polling might be detected.
  • However, the actual polling state isn't stored in the save file, since it regenerates itself after a few emulation cycles anyway; hence I can't tell whether polling had really been detected.
  • That also explains why it runs fine after loading, since any detected polling state is cleared on load.

Now, the polling state for this type of loop is only detected in exactly the one place where the patch applies. You could theoretically play with the number (lower values), or you could remove the poll detection call a few lines below it to check whether that makes the problem go away.
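In rough C form, the idea is something like this (an illustrative sketch with made-up names and a made-up threshold, not the actual PicoDrive code):

  /* Sketch of busy-wait (polling) detection, not the real implementation. */
  #define POLL_THRESHOLD 16           /* hypothetical cutoff */

  static unsigned poll_addr;          /* last address read by this CPU */
  static unsigned poll_count;         /* consecutive reads of that address */
  static int      cpu_idle;           /* set when the CPU is "turned off" */

  static void poll_detect(unsigned addr)
  {
      if (addr == poll_addr) {
          if (++poll_count > POLL_THRESHOLD)
              cpu_idle = 1;           /* stop emulating this CPU for now */
      } else {
          poll_addr  = addr;          /* different address: restart counting */
          poll_count = 0;
          cpu_idle   = 0;
      }
  }

  /* poll_addr/poll_count/cpu_idle aren't written to the save state; they are
   * simply reset on load, which is why loading a state "unlocks" the game. */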

The strange thing that bothers me is that I can't actually reproduce it. What's different between all your setups and my three?

@irixxxx
Collaborator

irixxxx commented Dec 10, 2024

To sum up:

  • different RA versions, so it's probably not that.
  • different devices with different CPU types (which?), so it's not the recompiler backend.
  • apparently not happening with a standalone version.
  • not happening at all in my setup (RA on x86 OSX, RA on linux ARM64, standalone on OSX and linux ARM64).

I used your core settings, so it's not that. Taking over the RA settings is a bit of work since none of the path information in it fits my install, but I'll try that next.

I'm currently at a loss. I can only ask you to try a git bisect to find the commit causing this, so that I have a pointer to where to look.

@irixxxx
Collaborator

irixxxx commented Dec 11, 2024

Just to make sure it's not that: what's the image checksum?

@CometHunter92
Author

CometHunter92 commented Dec 11, 2024

Thanks for the insights! Sorry for the late response.
So far, the core compiled with libretro VFS support disabled hasn't locked. However, when using the core with VFS enabled, it consistently hangs. I know it seems odd, but that's what I've observed. The MD5sum of the image is 901e97c9f731fbdf1f1ead0fbf58249a.
I'm testing this on a couple of H700 devices, an RG35XX with a Cortex-A9, and a Raspberry Pi 4.

@irixxxx
Collaborator

irixxxx commented Dec 11, 2024

Hmm. I made a libretro-testing branch in my repo and updated the libretro-common files. Could you please compile that branch from my repo and check if that works better?

@CometHunter92
Author

I tested it a bit and it seems OK; it hasn't locked yet. BTW, it needs this irixxxx#151 or it won't compile.

@irixxxx
Collaborator

irixxxx commented Dec 12, 2024

Really baffling. I can't see any of that lr-common stuff being used in the emulation itself. How is this possible?

@CometHunter92
Author

From the log it looked like the content was unloaded just before it locked, so I guess it could really be something related to the VFS. Isn't the VFS responsible for file handling? I guess for the emulation itself it's as if the cartridge was ejected.

@irixxxx
Collaborator

irixxxx commented Dec 12, 2024

I don't need your patch to compile the branch under either OSX or Linux, using make -f Makefile.libretro. Is there something I'm missing?

@CometHunter92
Author

I'm cross-compiling on WSL, Ubuntu 20.04 with gcc 9.4.0. Without that line it says:

platform/libretro/libretro.c:44:10: fatal error: platform/common/upscale.h: No such file or directory
   44 | #include <platform/common/upscale.h>
      |          ^~~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.
make: *** [Makefile:443: platform/libretro/libretro.o] Error 1

@irixxxx
Collaborator

irixxxx commented Dec 12, 2024

The Makefile has this line:

CFLAGS += -I$(PWD)

Are you using something like make -C <dir> -f Makefile.libretro for compiling?

@irixxxx
Collaborator

irixxxx commented Dec 12, 2024

Something like

CFLAGS += -I$(dir $(realpath $(shell echo "$(MAKEFILE_LIST)" | sed 's/Makefile.*/Makefile/')))

would probably work best, even if the list has whitespace in it.
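For illustration, here is roughly why the $(PWD) variant can break (a sketch assuming GNU make; the comments describe general make behaviour, not PicoDrive specifics):

  # -I$(PWD) takes PWD from the invoking shell's environment, i.e. the directory
  # you ran make from. It is NOT updated by 'make -C <dir>' or by build scripts
  # that invoke make from somewhere else, so <platform/common/upscale.h> may not resolve.
  CFLAGS += -I$(PWD)

  # $(CURDIR) is set by make itself to the directory it is actually running in,
  # so it does follow 'make -C <dir>':
  CFLAGS += -I$(CURDIR)

  # Deriving the path from $(MAKEFILE_LIST) instead anchors the include path to the
  # location of the Makefile itself, independent of where make was invoked from.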

@irixxxx
Collaborator

irixxxx commented Dec 13, 2024

I've committed a small change to the branch. Could you check if this compiles for you without your PR?

@CometHunter92
Author

CometHunter92 commented Dec 13, 2024

Are you using something like make -C <dir> -f Makefile.libretro for compiling?

Nope, I was compiling from the root directory.
BTW I've just copied that line from the libretro fork...

I've committed a small change to the branch. Could you check if this compiles for you without your PR?

Yes, now it compiles correctly without issues!

@CometHunter92
Author

There are more reports of people having performance issues specifically with the latest libretro core; those issues are solved by the core compiled from the libretro-testing branch of your repo.
