Forum Replies Created

  • davej
    Participant
    Post count: 21
    in reply to: Serious Sam ? #120712

    First of all, they’ve announced that they have released the source for the game engine used to create Serious Sam – not that they have released the source to the game itself.

    I had a quick browse of the released code and this is what I found:

    1. It’s the source code only. Things like level editors are included and, although there is source to a program called SeriousSam, it’s not clear whether that is the actual game or just a test app.

    2. It doesn’t include the resources associated with the game (3D models, textures, etc.). There is an example set of resources but that is not the game itself. There’s a comment on GitHub about copying the resources from the game, which reinforces this.

    3. It’s very much a Windows application. It assumes it’s running on Windows, was compiled using Microsoft development tools and relies on the Windows libraries.

    I’d say the chances of it being ported to work on the Pi are negligible.

    dave j

    davej
    Participant
    Post count: 21

    [quote=119441]Hello Davej,

    On the RetroPie/common-shaders repo, the file was added on the 13/01. Is this the last version or the one you gave in this thread is the latest one?
    [/quote]

    The most recent version in this thread is the latest one.

    dave j

    davej
    Participant
    Post count: 21

    You can find them at: https://buildbot.libretro.com/assets/frontend/shaders_glsl.zip (My shader hasn’t been added yet.)

    It should work with standard OpenGL. It certainly works on my Linux PC with AMD’s drivers.

    dave j

    davej
    Participant
    Post count: 21

    I’ve managed to speed this up again. It will do 1080P@60Hz on a Pi2 with the proper gamma correction as long as the CPUs aren’t doing too much memory access. It’s often OK but if you run anything too taxing it drops about 1 in 1000 frames – you’ll probably have trouble noticing it.

    Since it’s limited by the CPUs and the GPU fighting over access to memory, reducing CPU usage makes it more likely to succeed. Using faster emulators (e.g. snes9x_next rather than pocketsnes) and changing the audio resampler driver to ‘nearest’ both help. Using simpler games means the emulator has to do less work, which helps too! ;)
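    If you’d rather change the resampler in retroarch.cfg than through the menu, the relevant line (assuming a reasonably recent RetroArch) is:

    [code]
    audio_resampler = "nearest"
    [/code]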

    For other Pis, it might allow emulators to run a little faster but won’t make much difference.

    There are no changes to the visible output.


    @gizmo98

    I’ve posted this to the libretro forum as some people there use Pis and the shader is also useful for people using phones/tablets. They are going to include it in their GLSL shaders pack so, when it appears there, it will make sense to use that version rather than distribute it twice.

    davej

    davej
    Participant
    Post count: 21

    [quote=113702]Thanks very much for the work davej. Looking forward to testing it out.
    Would you say this shader is suggested to be used in place of your original whether or not you have a Pi2? Or is the original better in any way for Pi2 users?

    Thanks again for your time.
    [/quote]
    The new shader, even with full gamma correction, is faster on all Pi versions and so should be considered a replacement for the original.

    dave j

    davej
    Participant
    Post count: 21
    in reply to: Video mode error #112919

    You can configure the mode to use by setting the hdmi_group and hdmi_mode values in /boot/config.txt – see config.txt documentation on raspberrypi.org.

    You can find out which values to set as follows:


  • Boot your Pi having switched the monitor on first so you get a picture.

    Run [b]tvservice -s[/b]

    This will display something like:
    [code]state 0x120016 [DVI DMT (58) RGB full 16:10], 1680x1050 @ 60.00Hz, progressive[/code]
    The important bit is the DMT (58). You might have CEA instead of DMT – whichever it is defines the value to use for hdmi_group (1 for CEA, 2 for DMT). The value in brackets is the one to use for hdmi_mode.

  • You can find out which modes your screen supports by running [b]tvservice -m DMT[/b] (or CEA if tvservice -s returned that) and pick another one if you like. It’s a good idea to use one that is 60Hz as that matches the vsync rate games are written to work at.
  • Edit /boot/config.txt and add hdmi_group and hdmi_mode lines with the values found above (see the example below).
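    For example, for the [b]tvservice -s[/b] output shown above (DMT, mode 58), the lines to add to /boot/config.txt would be:

    [code]
    # group 2 = DMT, mode 58 = 1680x1050 @ 60Hz
    hdmi_group=2
    hdmi_mode=58
    [/code]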

davej
Participant
Post count: 21

[quote=110787]I recently tested your amazing filters.
As you have already written, it is a bit too slow in 1080p.
Only at a resolution of 1360×768 (most HD Ready devices) it’s really smooth, especially for SNES.
But there is the problem with the colors of the scanlines if the display is curved.
It is somewhat difficult to describe, so I made a screenshot.
I’ve played around with the filter a bit, to amplify the effect.
(but I can not really modify it)
The effect occurs even at 1080p, due to the higher resolution, you can see it but barely.
If the screen is not curved, the effect does not occur.
Maybe you can calculate the scanlines in a different way.
And it would be good if you could adjust the brightness of the scan lines.[/quote]

That’s the moire effect that comes from trying to display curved lines on a display with pixels in a straight line. Some displays show it more than others – one of my monitors barely shows it at all, another shows it a bit and on my TV it’s quite noticeable. Unfortunately, the best you can do is try to minimize it.

Some things I did find helped with my TV that you might like to try:

Don’t let the display do any scaling, i.e. don’t set the Pi to a 720 screen mode and let the TV upscale to 1080. Run the Pi at the screen’s native resolution and set a viewport in RetroArch for the resolution you want. This will result in borders around the screen but will produce a better picture.

Setting the viewport height to either 3 times the game’s vertical resolution (for a 720 screen) or 4.5 times (for 1080) significantly reduced the moire effect. The easiest way to get 3 times is to use integer scaling, but you lose the 4:3 aspect ratio if you do that. The best way is to set a custom aspect ratio (Options->Video Options->Custom Ratio). 224×3 = 672, so you only lose 1/15th of the screen height on a 720 display (1/8th on 768). 224×4.5 = 1008, so you again lose 1/15th of the screen height on a 1080 display. To maintain a 4:3 aspect ratio you’d want to set custom ratios of 896×672 or 1344×1008 respectively.
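If you’d rather put the viewport in retroarch.cfg than set it in RGUI, a hypothetical fragment for a 1920×1080 screen using the 1344×1008 ratio above would be (select the Custom aspect ratio as described, then these set the size and centre it):

[code]
custom_viewport_width = "1344"
custom_viewport_height = "1008"
# centre it: (1920 - 1344) / 2 and (1080 - 1008) / 2
custom_viewport_x = "288"
custom_viewport_y = "36"
[/code]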

Stronger scan lines produce worse moire effects and the value chosen is designed to balance being visible whilst minimizing moire effects. The scan lines are based on those in the crt-geom shader (which seemed to have the best/least moire) and one of the optimizations I did to get my shader going as fast as it does was to simplify the maths around how they are calculated. Doing this lost the bits that allow them to be easily changed in crt-geom.

davej
Participant
Post count: 21

I’ve tried the shader configuration editing that CG shaders support with GLSL shaders and it does work, so here’s an updated version of my shaders. You can now edit most of the configuration parameters in RGUI.

You can stop screen curvature by setting the two curvature values to 0.0 but, as before, commenting out the [b]#define CURVATURE[/b] line is better as it runs faster. (It skips the code entirely rather than running code which has no effect.)

Multi-sampling is still controlled by editing the shader file to disable the code, for similar reasons.

I tried it with my early 256Mb Pi1 model B and it works pretty well on that (admittedly overclocked at 900MHz) so it should be OK on a model A too.

davej

davej
Participant
Post count: 21

[quote=109854]Is it possible to skip the screen distortion somehow?

[/quote]
There are some variables set at the top of the shaders/crt-pi.glsl file that can be used to control some aspects of the shader.

#define MULTISAMPLE
#define CURVATURE
#define BARREL_DISTORTION_X 0.15
#define BARREL_DISTORTION_Y 0.25
#define MASK_BRIGHTNESS 0.65
#define SCAN_LINE_WEIGHT 6.0
#define BLOOM_FACTOR 1.5
#define INPUT_GAMMA 2.4
#define OUTPUT_GAMMA 2.2

You can change the curvature by changing the BARREL_DISTORTION_X and BARREL_DISTORTION_Y values (0.0 will turn it off for that dimension). Better, you can turn it off entirely by commenting out the CURVATURE line (put a double slash // at the start of the line). This will cause it to skip the distortion code and speed up the shader a bit. The barrel and mask shaders can be controlled similarly but just have the BARREL_DISTORTION_X and BARREL_DISTORTION_Y values.

You can similarly comment out the MULTISAMPLE line and speed things up a bit by skipping the multi-sample code – at the risk of increasing moire effects.
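For example, with both curvature and multi-sampling disabled, the top of the file would look like this (only the comment markers are added, everything else stays the same):

[code]
//#define MULTISAMPLE
//#define CURVATURE
#define BARREL_DISTORTION_X 0.15
#define BARREL_DISTORTION_Y 0.25
#define MASK_BRIGHTNESS 0.65
#define SCAN_LINE_WEIGHT 6.0
#define BLOOM_FACTOR 1.5
#define INPUT_GAMMA 2.4
#define OUTPUT_GAMMA 2.2
[/code]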

MASK_BRIGHTNESS controls how bright the screen will be.

You’re best leaving the others alone. Altering SCAN_LINE_WEIGHT and BLOOM_FACTOR will likely make moire effects worse. The two GAMMA values affect brightness and should not need changing on a correctly configured monitor (and if your monitor is not correctly configured, fixing that is a far better idea).

It’s perhaps worth pointing out that the shader is designed to be used with linear filtering. CRT electron beams are not square and the combination of linear filtering with scan lines and bloom makes the edges of bright pixels next to dark pixels rounded, which looks a bit more accurate. People are of course free to set filtering to nearest if they prefer.
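If you want to check or change the filtering outside RGUI, the retroarch.cfg setting that controls hardware bilinear filtering is (as far as I know):

[code]
video_smooth = "true"
[/code]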

I’ll see if I can get the shader parameter editing that works with .cg shaders working with .glsl ones next week. The automatic conversion process normally used strips them out, but I don’t know if that’s because they won’t work or just because of a deficiency in the conversion process.

davej

davej
Participant
Post count: 21

[quote=109447] @davej: That shader looks great for running on a Pi at decent speeds. Are you testing it with more heavier systems, mame games perhaps? It’d be great if it could be the end all solution to getting a consistent nostalgia look to all retroarch systems.
[/quote]

I’m focusing entirely on the Pi2 at the moment but it should run on anything supporting OpenGL ES2. I’ll make it compatible with OpenGL 2.1 before release so it can be used with desktop GPUs.

I’ve only tried it with SNES games but it’s a Retroarch shader and so should work with emulators supported by that.

It probably won’t make a good solution for all systems. It’s been designed to work within the limits of the Pi2. Faster GPUs will be able to run more complicated shaders – and multi-pass ones which are performance killers for the Pi. It will probably be good for other mobile GPUs.

davej
Participant
Post count: 21

[quote=109436]I think the second (lanczos2) as it doesn’t have that sort of ghosting effect, but there’s basically nothing in it for me.
[/quote]
I prefer the lanczos2 on my monitors but some games look better with the cubic on my TV. It’s easy enough for me to provide both though so I guess that’s what I’ll do.

davej
Participant
Post count: 21

Last set:

[attachment file=”cubic-3.png”]

[attachment file=”lanczos2-3.png”]

In case you haven’t guessed, I’m writing a Pi2 friendly CRT shader. The aim is to get it doing 1080P@60Hz – which it does so far.

dave j

davej
Participant
Post count: 21

Next set:

[attachment file=”cubic-2.png”]

[attachment file=”lanczos2-2.png”]

davej
Participant
Post count: 21

I’m hijacking this thread to ask: which do you prefer, cubic or lanczos2 filtering? In the following few posts the first image uses cubic filtering, the second uses lanczos2.

[attachment file=”cubic-1.png”]

[attachment file=”lanczos2-1.png”]

davej
Participant
Post count: 21

I had a look at your file and found the problem. The forum has changed the minus signs to dashes – so OpenGL no longer thinks it’s a valid shader program.

Change them to minus signs and it should be OK.
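If you’ve already got a broken copy, something like this (assuming a UTF-8 locale; replace shader.glsl with whatever you called your file) should fix them all in one go:

[code]
sed -i 's/–/-/g' shader.glsl
[/code]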

davej
Participant
Post count: 21

I didn’t have to do anything that’s not included here. Once you select Apply Shader Changes you can see the effect of the shader, as the menu is semi-transparent.

davej
Participant
Post count: 21

Save everything from the [b]#if defined(VERTEX)[/b] to the [b]#endif[/b] inclusive to a file with a name that ends in .glsl and copy it into the same directory as the other .glsl files. It should appear in the list when you next try to pick a shader in RetroArch.

You can tweak the amount of barrel distortion by altering the value in the BARREL_DISTORTION line. Changing it to 0.0 will produce no distortion; the 0.25 it’s currently set to is quite large, to demonstrate the effect.
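For example, a hypothetical gentler setting would change that line in the fragment shader to:

[code]
const mediump float BARREL_DISTORTION = 0.10;
[/code]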

davej
Participant
Post count: 21

I got bored and wrote a barrel distortion only shader.

It runs speedily on my original 256Mb model B, which matches a model A+ hardware-wise, even on a 1080 screen.

It doesn’t do any sort of anti-aliasing so the results don’t look great by default – it has the image problems I described earlier. Because you are going to be using such a small screen, however, you can get a substantial image improvement by specifying that the input texture should have linear filtering rather than nearest. These two images show examples of the same screen with nearest and linear filtering.

[attachment file=”106536″]

[attachment file=”106537″]

I’ve tried adding it as an attachment but it doesn’t let me, and code tags don’t seem to like it either, so here it is – up to and including the #endif:

#if defined(VERTEX)
uniform mediump mat4 MVPMatrix;
attribute mediump vec4 VertexCoord;
attribute mediump vec2 TexCoord;

varying mediump vec2 TEX0;

void main()
{
    TEX0 = TexCoord;
    gl_Position = MVPMatrix * VertexCoord;
}
#elif defined(FRAGMENT)
uniform sampler2D Texture;
uniform vec2 InputSize;
uniform vec2 TextureSize;
varying vec2 TEX0;

const mediump float BARREL_DISTORTION = 0.25;
// Rescale so the distorted image still fills the viewport.
const mediump float rescale = 1.0 - (0.25 * BARREL_DISTORTION);

void main()
{
    // InputSize is the emulator frame size, TextureSize the (larger) texture
    // it lives in, so scale maps texture coordinates onto just the frame.
    vec2 scale = TextureSize / InputSize;
    vec2 tex0 = TEX0 * scale;
    // Move the origin to the centre of the image and apply the distortion.
    vec2 texcoord = tex0 - vec2(0.5);
    float rsq = texcoord.x * texcoord.x + texcoord.y * texcoord.y;
    texcoord = texcoord + (texcoord * (BARREL_DISTORTION * rsq));
    texcoord *= rescale;

    // Anything pushed outside the original image becomes black border.
    if (abs(texcoord.x) > 0.5 || abs(texcoord.y) > 0.5)
        gl_FragColor = vec4(0.0);
    else
    {
        texcoord += vec2(0.5);
        texcoord /= scale;
        vec3 colour = texture2D(Texture, texcoord).rgb;

        gl_FragColor = vec4(colour, 1.0);
    }
}
#endif

davej
Participant
Post count: 21

I’m the guy from the RPi forum who said it could be done.

The not wanting to help bit was mainly because I didn’t know what I would have been signing up to – it may have been hundreds of hours work for all I knew. That’s why I suggested you needed to provide people with more information about your project to interest them.

So, having got that out of the way…

That bit of information saying you only need 320×240 resolution is crucial.

The reason the Raspberry Pi has problems running the shaders in RetroArch is to do with how fast its GPU can access memory. Desktop GPUs, which RetroArch shaders have been written for, have very fast memory access but, because memory access by GPUs is power hungry, mobile GPUs use a different design approach that minimises the number of memory accesses they have to do to save power. There’s an explanation of why [url=http://blog.imgtec.com/powervr/a-look-at-the-powervr-graphics-architecture-tile-based-rendering]here[/url] if you’re interested in the details.

The multiple pass rendering technique that retroarch uses is pretty much the worst case scenario for mobile GPUs because it needs lots of memory accesses. When people say the Raspberry Pi is too slow running these shaders they are typically considering running them at desktop resolutions.

The fact that you only need such a small resolution makes a very big difference. The display you use takes up about 5% of the memory an emulator running at 1080 would. This means there is a good chance the Raspberry Pi could run some of those shaders at a decent frame rate.

dankcushions’s suggestion of trying just the curvature shader is a good one.

There’s a problem you need to consider when doing barrel distortion on a display that is so close in resolution to the game’s. When the vertical resolution after barrel distortion is less than the game’s vertical resolution, you will lose parts of the game screen. This may cause unacceptable problems – you might lose important bits of the game. You might be able to mitigate this with an anti-aliasing shader but they need lots of memory accesses so you might run into performance problems again.
