
View Full Version : Spotted a probable solution for texture glitches with Intel graphics chips


overcoder
10th July 2011, 01:15 AM
Hi N64 lovers,

This post outlines a PJ64 1.6 issue and a possible solution, which could also be an improvement for PJ64 1.7, so I put it here.

I regularly play and replay my favorite N64 games using the great (if not the greatest) PJ64 emulator (many, many thanks to the authors!). But, like many other players (here (http://forum.pj64-emu.com/showthread.php?t=2528), or here (http://forum.pj64-emu.com/showthread.php?t=454)), I ran into many glitches, especially texture glitches in video rendering with Intel graphics chips.

Long ago, I tried to investigate the issue, and I think I've come up with a possible solution. I'm a not-so-happy user of a Mobile Intel(R) 965GM Express Chipset, which would not work with PJ64 1.6 except with two video plugins:


The official Jabo's Direct3D8 1.6
Rice's Video Plugin 6.1.1 beta 10


I've tested many others, but no joy, nothing playable. In my case, these two are very stable when kept on their default configuration; I don't remember their last crash or freeze ...

The only remaining problem is textures not "covering surfaces", and sometimes disappearing when viewed from up close. The problem is always present with Jabo's plugin, and only present with Rice's plugin if I keep the Depth Buffer at the default value, i.e. 16 bits (I know what a depth buffer is; I've played a little with OpenGL). If I switch it to 32 bits, texture rendering is perfect, "N64 speaking" (I checked against my real N64 at many points, and for many games).
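For intuition on why the bit depth matters so much: with a perspective projection, depth precision falls off with the square of the distance from the camera, so a 16-bit buffer can leave nearby surfaces indistinguishable. A rough sketch (the near/far plane values here are invented for illustration, not taken from either plugin):

```python
def depth_resolution(z, near, far, bits):
    """Approximate smallest eye-space depth difference a z-buffer can
    resolve at eye distance z, assuming the standard perspective
    projection (hyperbolic mapping of eye depth to [0, 1]) quantized
    to 2**bits uniform steps."""
    levels = 2 ** bits - 1
    # Derivative of window depth with respect to eye depth:
    slope = far * near / ((far - near) * z ** 2)
    return 1.0 / (levels * slope)

near, far = 1.0, 10_000.0  # illustrative values only
for bits in (16, 24, 32):
    print(f"{bits}-bit: ~{depth_resolution(100.0, near, far, bits):.2e} units at z=100")
```

With these made-up numbers, a 16-bit buffer can't tell apart surfaces closer together than about 0.15 world units at z=100, while a 32-bit buffer resolves down to microns. That's exactly the kind of gap where flat details drawn on top of a floor stop being distinguishable from the floor itself.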

It looked like Rice's plugin was the final solution, but it fails to handle menu screens in Zelda OOT and some other "crucial" situations, like JiggyWiggy's puzzles in Banjo-Tooie. Jabo's plugin worked perfectly for everything else, except the texture rendering!

A little screenshot of the depth buffer option:
http://overcoder.files.wordpress.com/2011/07/rice-plugin-depth-option.png

I took screenshots of the following video configs when playing my favorite games:

Jabo's Direct3D8 1.6 (no options to tweak)
Rice's plugin 6.1.1b10 (depth buffer 16 bits)
Rice's plugin 6.1.1b10 (depth buffer 32 bits)



http://overcoder.files.wordpress.com/2011/07/sm64-2-jabo.png
http://overcoder.files.wordpress.com/2011/07/sm64-2-rice-depth16.png
http://overcoder.files.wordpress.com/2011/07/sm64-2-rice-depth32.png


The number of screenshots you can include in a post is limited, so I attached an archive with many others.

In the case of Rice's plugin with the depth buffer storing 16-bit z-values, the two stars on the doors behind Mario are not drawn at all. The sun carpet in the middle of the room is also not shown. And worst of all, Mario has no shadow!! He's a vampire! (silly joke, I agree :))

Switching to 32 bits, everything is drawn. Jabo's plugin is somewhere between the two cases: the stars on the doors are drawn, but the carpet is only shown from far points of view. Naively, I would say it looks like 24-bit behavior ...
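The missing stars, carpet, and shadow all fit the classic coplanar-decal problem: the decal sits so close to the surface under it that, after quantization, both end up with the same depth-buffer value and the decal loses the depth test. A toy sketch of the rounding (the depth values are invented for illustration):

```python
def quantize_depth(d, bits):
    """Round a normalized window depth in [0, 1] to the integer value
    a depth buffer of the given bit width would store."""
    levels = 2 ** bits - 1
    return round(d * levels)

floor_depth = 0.6        # invented window depth of the floor
decal_depth = 0.599999   # decal drawn a hair closer to the camera
for bits in (16, 32):
    collide = quantize_depth(floor_depth, bits) == quantize_depth(decal_depth, bits)
    print(f"{bits}-bit: decal and floor share a depth value -> {collide}")
```

At 16 bits the two depths land on the same integer, so the decal ties (or loses) the depth comparison and vanishes; at 32 bits they stay distinct and the decal survives.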

32 bits solved all my "disappearing textures" problems. No more missing roads or hidden symbols inside temples in Zelda OOT (I remember a maze where a yellow arrow was drawn on the floor right next to a crate to push, but it never showed up for me in-game, keeping me scratching my head for an entire afternoon), no more missing warp portal numbers or patchwork landscapes in Donkey Kong 64, and no more Mario without a shadow on the floor, where any slightly difficult jump makes you restart the stage ...

If it is not already using 32 bits for its depth buffer, could Jabo's plugin include this option and expose it through its configuration dialog? And could it be made available as an update to Jabo's plugin for PJ64 1.6? Intel graphics card users would be so grateful!

Anyway, thanks for all the effort put in by the PJ64 team; playing my favorite N64 games is priceless!!

Note: I just started learning English a couple of months ago, sorry for any mistakes!

overcoder
11th July 2011, 12:31 PM
No one has anything to say? Am I posting in the wrong place?

HatCat
11th July 2011, 06:23 PM
Sorry, actually I was just about to submit a reply on this a couple nights ago, but just as I clicked submit my connection died out XD. Then I forgot.

Even back when I had my mid-range NVIDIA GeForce FX 5200 (with or without MSFT's driver versions in place), I never noticed a difference between 16 and 32 bits for the depth buffer setting. The plugin's tool tips also say "You don't need to modify this setting." if you have tool tips enabled in the general options of the plugin settings.

But this is basically what I had thought. Rice's older video plugins were also designed to include some support for older graphics cards, with compatibility features like this one. Some of that was eventually removed before the 6.1.0 versions, but according to your testing here, this option pretty much does what I thought it would.

Heh, at first when I was looking at your comparison screenshots, I was like, WTF, I know there's some sort of major difference here, but I couldn't quite make it out (had to keep scrolling up and down). Then when you explained that the carpet and door stars were missing I was like omg....

overcoder
26th July 2011, 12:39 AM
Thanks for your reply! My response comes late; I was away from any keyboard for about two weeks :)

I'd appreciate hearing from other Intel graphics card users, and the results they get with different video plugin settings. If my results are reproducible, I think the mainstream team can invest some precious time in adding/fixing/testing ...

I'm available to *heavily test* any idea on an Intel graphics card. Those chips are quite common, embedded in many not-so-old computers. I really want Project64 v1.7 to be the perfect N64 emulator!

TheRealM
26th July 2011, 01:23 AM
A solution is the software renderer.

dsx_
26th July 2011, 08:29 AM
A solution is the software renderer.

isn't that slow as fuck though?

HatCat
27th July 2011, 12:11 AM
So have you tried the OpenGL engine on the Intel chip you're using, to see whether the setting fixes this there as well?

Also, Jabo is currently testing on Intel, though it's probably a newer model than yours and doesn't have issues like this.

daaceking
11th August 2011, 03:10 AM
My Intel HD series runs 1.6 brilliantly. Intel is new to this graphics thing, so your model was probably not the right version to get, if you know what I mean.

HatCat
11th August 2011, 06:16 AM
oh definitely lol, Intel was just as terrible for N64 emulation as anything else years ago

michaeltheiii
5th October 2011, 04:14 AM
Hello. As you might have seen in the 1.6 forums, I also had these problems. I found that the best plugin, though only for Super Mario 64, is glN64. So far, the only problem with that plugin is the Peach picture fading to Bowser (or not). Hopefully this is a good addition!

HatCat
8th October 2011, 01:06 AM
glN64 has only experimental support for the dithered alpha blending (the dissolve effect) in the game. Direct64's "PC noise at resolution" option supports Mario 64's dissolve effect better, and Glide64 does too, with higher-quality support for texture LOD and shit like that (e.g. the picture-fading error you just described, which happens on most other plugins, though not all).
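For anyone wondering what the "dissolve" effect actually does: it's screen-door transparency, where each pixel is kept or discarded by comparing its alpha against a repeating per-pixel threshold pattern instead of blending smoothly. A minimal sketch using a 4x4 Bayer matrix (this is textbook ordered dithering, not anything lifted from glN64, Direct64, or Glide64 specifically):

```python
# Classic 4x4 Bayer threshold matrix (values 0..15).
BAYER4 = [[ 0,  8,  2, 10],
          [12,  4, 14,  6],
          [ 3, 11,  1,  9],
          [15,  7, 13,  5]]

def dither_alpha_keep(x, y, alpha):
    """Screen-door transparency: keep the pixel at (x, y) iff its alpha
    exceeds the threshold the Bayer matrix assigns to that position."""
    threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0
    return alpha > threshold

# At alpha = 0.5, exactly half of each 4x4 tile survives:
kept = sum(dither_alpha_keep(x, y, 0.5) for y in range(4) for x in range(4))
print(kept)  # 8 of 16 pixels kept
```

As the alpha animates from 1.0 down to 0.0, progressively fewer pixels pass their threshold, which produces the coarse dissolve pattern you see in the Peach-to-Bowser picture transition instead of a smooth crossfade.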

squall_leonhart
17th February 2012, 09:48 PM
Resolved in 1.6.1.

If it's not fixed on your POS half-implemented Intel GPU, then take it up with the idiots who decided Intel didn't need a real vertex shader.

Just to note, the GeForce 4 MX also had this issue when NVIDIA stopped emulating the vertex shader with the Release 60 drivers.

mastertwitch
13th March 2014, 06:31 PM
Hey, your tips on configuring Rice's plugin totally fixed the shadows, for now I guess!!! Anyways, thanks again buddy.