Hi
I was trying again to understand shaders.
My goal would be to apply a normal map shader to my tile-based game.
I have a tilemap which is composed of RGB tiles, with "objects" (Sprites, but they could be meshes) on top of it.
I am currently using a basic fog of war made with a RenderTarget and various "holes" in a black shape to simulate lights.
I'd like to "bump" these objects according to my existing lights. I can provide normal map textures for each object.
My textures come from a texture pack, so I am using atilim's trick to get texture regions:
http://giderosmobile.com/forum/discussion/comment/21594#Comment_21594
I understood that in the normal map example there is a mesh with the "first texture" set like this:
local mesh = Mesh.new()
mesh:setVertexArray(0, 0, 512, 0, 512, 512, 0, 512)
mesh:setTextureCoordinateArray(0, 0, 512, 0, 512, 512, 0, 512)
mesh:setIndexArray(1, 2, 3, 1, 3, 4)
mesh:setTexture(texture)
mesh:setShader(effect)
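For context, "effect" is the Shader object the example creates elsewhere. A minimal sketch of what that creation might look like, assuming uniform names taken from the GLSL quoted later in this thread (g_Texture, g_Color) plus a made-up fLightPos; check the shipped example files for the real declarations:

local effect = Shader.new(
	"vs", "ps",                                   -- vertex / fragment shader files
	Shader.FLAG_NONE,
	{	-- uniforms
		{name = "g_MVPMatrix", type = Shader.CMATRIX, sys = Shader.SYS_WVP, vertex = true},
		{name = "g_Color", type = Shader.CFLOAT4, sys = Shader.SYS_COLOR, vertex = false},
		{name = "g_Texture", type = Shader.CTEXTURE, vertex = false},
		{name = "fLightPos", type = Shader.CFLOAT4, vertex = false},  -- assumed name: light position set from Lua
	},
	{	-- standard vertex attributes
		{name = "vVertex", type = Shader.DFLOAT, mult = 2, slot = 0, offset = 0},
		{name = "vColor", type = Shader.DUBYTE, mult = 4, slot = 1, offset = 0},
		{name = "vTexCoord", type = Shader.DFLOAT, mult = 2, slot = 2, offset = 0},
	})

-- updated whenever the light moves (values here are just an example):
effect:setConstant("fLightPos", Shader.CFLOAT4, 1, {256, 256, 150, 1})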
but I am missing a lot of things, starting from the basics:
1) "where" are the coordinates to the "normal map texture"?
2) Is it possible to add more than one light source? How?
Thank you
Comments
I remember I tried to generate a normal map by applying some Photoshop effects; it was OK, but not great.
As I'm not a visual guy, it was hard for me to find proper values, etc.
Usually, as far as I've seen, some assets already include normal maps created by the asset creator.
But as I said, it should be possible to create them with some image editors, depending on your skill level.
Also, I'm really interested to see a video demo of what you achieve, whenever you get something working.
There are many other tools to draw normal maps: I was thinking about https://www.codeandweb.com/spriteilluminator or exporting them directly from Blender or SketchUp.
However my problem is not (yet) drawing normal map textures, but how to "mix them up" using Gideros shaders.
From what I understood it should work like this:
Texture is the bitmap I need to show
TextureMap is the normal map that should "change" the values of lighting in Texture pixels, according to light position.
Cross.png, the demo texture used in the example, is a single 1024x512 texture which contains both (Texture and TextureMap, side by side). My texture size is different and I am taking my textures from a texture pack, so it should be a slightly different approach... however, this is the main reason for my questions:
1)"where" are the coordinates to the "normal map texture"?
Reading the example I only see a Mesh constructor, with a texture 512x512px (starting from 0,0) attached.. now.. how does the shader know that it needs to get the TextureMap (and "where" should it take it from? from which coordinates? )
2)Is it possible to add more light sources using the same shader? and how to do it?
thank you
this is where you get the normal map:
mediump vec3 normal = texture2D(g_Texture, texCoord + vec2(0.5, 0.0)).rgb * 2.0 - 1.0;
Notice the texCoord + vec2(0.5, 0.0): it means the normal is sampled at the main coordinate plus half of the total texture width.
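For a texture taken from an atlas that 0.5 would not hold: it only works because the normal map starts exactly half-way across the 1024-wide Cross.png. A minimal sketch of a fragment shader where the offset is passed in as a uniform instead (the name fNormalOffset is made up, and the lighting math is left out to keep it short):

uniform lowp sampler2D g_Texture;
uniform mediump vec2 fNormalOffset;     // e.g. regionWidth / atlasWidth, 0.0 - set from Lua
varying mediump vec2 texCoord;          // comes from the vertex shader, as in the example

void main()
{
    lowp vec3 color0 = texture2D(g_Texture, texCoord).rgb;
    mediump vec3 normal = normalize(texture2D(g_Texture, texCoord + fNormalOffset).rgb * 2.0 - 1.0);
    gl_FragColor = vec4(color0, 1.0);   // lighting omitted; only the normal lookup changes
}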
I'm not sure how you can load another texture alongside your main texture with Gideros code.
For my case, I just use a renderTarget for it.
2. Yes, simply by adding them up with a for loop.
This probably would help:
https://github.com/mattdesl/lwjgl-basics/wiki/ShaderLesson6
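To make the "for loop" idea concrete, here is a minimal sketch of a multi-light fragment shader; the fLights array and the position varying are assumptions (the shipped example handles a single light, possibly with different names), but the accumulation pattern is the point:

#define MAX_LIGHTS 4

uniform lowp sampler2D g_Texture;
uniform lowp vec4 g_Color;
uniform mediump vec4 fLights[MAX_LIGHTS];   // xyz = light position, w = 1.0 if the light is on
varying mediump vec2 texCoord;
varying mediump vec3 position;              // fragment position, passed down by the vertex shader

void main()
{
    lowp vec3 color0 = texture2D(g_Texture, texCoord).rgb;
    mediump vec3 normal = normalize(texture2D(g_Texture, texCoord + vec2(0.5, 0.0)).rgb * 2.0 - 1.0);

    mediump vec3 totalLighting = vec3(0.0);
    for (int i = 0; i < MAX_LIGHTS; i++) {
        if (fLights[i].w > 0.5) {                          // skip lights that are switched off
            mediump vec3 lightDir = normalize(fLights[i].xyz - position);
            lowp float diff = max(0.0, dot(normal, lightDir));
            totalLighting += color0 * diff;                // each light adds its contribution
        }
    }
    gl_FragColor = g_Color * vec4(totalLighting, 1.0);
}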
Likes: pie, oleg
A pixel shader is actually a function to process color, and it is applied to every pixel on screen.
From the normal map example (ps.glsl), this:
lowp vec3 color0 = texture2D(g_Texture, texCoord).rgb;
would take the color of the current pixel.
Also,
gl_FragColor = g_Color * vec4(color0 * diff + color1 * spec, 1);
This, at the end, is actually the return of the color, with value vec4(r,g,b,a).
[Notice that the alpha here is 1; if you want a variable alpha, you might want to do something like this:
lowp vec4 colorF = texture2D(g_Texture, texCoord);
lowp vec3 color0 = colorF.rgb;
.
.
.
gl_FragColor = g_Color * vec4(color0 * diff + color1 * spec, colorF.a);
]
To increase the lighting, you can just add to r/g/b.
On how normal mapping works:
Basically it checks (with a dot product) how well the normal map value at the pixel lines up with the direction to the light. You can see this in this function here:
lowp float diff = max(0.0, dot(normal, lightDir));
that "dot" function.
This is also why in normal mapping x = red, y = green, z = blue, because it compare x with red and so on.
By comparing, you get a value of distance between light position and normalmap value of that pixel, and this value then work as multiplier of the color : ( color0 * diff ).
color1 at there is just added thing to add something like ilumination, you actually can do simpler function other than that formula.
I hope this help to understand more.
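To see the numbers, here is the same decode-and-dot math done on the CPU in plain Lua (just a sanity check; the shader does this per pixel):

local function decode(r, g, b)                   -- normal map RGB (0..255) -> normal vector (-1..1)
	return r / 255 * 2 - 1, g / 255 * 2 - 1, b / 255 * 2 - 1
end

local function dot(ax, ay, az, bx, by, bz)
	return ax * bx + ay * by + az * bz
end

local nx, ny, nz = decode(128, 128, 255)         -- the typical "flat" normal map colour, roughly (0, 0, 1)
print(dot(nx, ny, nz, 0, 0, 1))                  -- light pointing straight at the surface: diff ~ 1
print(dot(nx, ny, nz, 1, 0, 0))                  -- light grazing along the surface: diff ~ 0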
Likes: pie
I made some progress, here's a screenshot of my test if you're interested.
@hgy29 I hope you find some time to add comments to all the shader examples, this is gold.
I tried your suggestion but I wasn't able to make it work out of the box (unless I misunderstood something).
Maybe this is due to how the isometric Tilemap is composed: if I apply the effect to each single tile mesh it "breaks the image", showing only an offset portion of each tile (black corners on top, and the bottom half of each tile missing).
Is there a technical reason why this shader only works with meshes, or is it a "bug"?
In the end I made it work, but I feel like I've cheated
I made an additional normalmap-texture tileset in Tiled and drew my normal map inside a Tiled level.
When I load the tilemap I draw the "normalmap level" beside it, then rendertarget both (as if they were a single big texture) and then create a mesh to "host" this texture.
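Roughly, in code, it amounts to something like this sketch (tilemap, normalmapTilemap and effect stand for my own objects; the real sizes are whatever the loader produces):

local W, H = tilemap:getWidth(), tilemap:getHeight()

-- draw the colour tilemap on the left and the normal map tilemap right next to it
local rt = RenderTarget.new(W * 2, H)
normalmapTilemap:setPosition(W, 0)
rt:draw(tilemap)
rt:draw(normalmapTilemap)

-- host the combined texture on a mesh; the shader then fetches the normal at
-- texCoord + vec2(0.5, 0.0), exactly like in the Cross.png example
local mesh = Mesh.new()
mesh:setVertexArray(0, 0, W, 0, W, H, 0, H)
mesh:setTextureCoordinateArray(0, 0, W, 0, W, H, 0, H)
mesh:setIndexArray(1, 2, 3, 1, 3, 4)
mesh:setTexture(rt)
mesh:setShader(effect)
stage:addChild(mesh)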
However I am not sure that this is the best option: I now have a "bonus texture" loaded, as big as my tilemap x 2, and a new mesh on stage.
What do you think?
@tkhnoman do you render each tile-texture, or are you doing something similar?
By the way I believe this approach has a benefit: the normal maps are independent from the classic tileset, so it's possible to achieve some vfx, and there should be no issue when flipping tiles (bumps need to be inverted).
A curious side effect (among the many things I don't understand, since I don't know math very well...) is that I needed to change the vector grabbing position to 0.313 instead of 0.5.
Thank you
Likes: antix
I started playing with normals for my 3D first person game demo and faced a few issues with the normal map example, which is too simplistic. Among other things:
- Coordinates used for computations are in Sprite local space, which makes it hard to use if several Sprites share the shader
- It assumes that the shape drawn is in 2D (maybe not an issue for you)
- I am still fighting to correctly compute the light source position to give to the shader.
But, now that I think about it, the width of a drawn isometric Tilemap (which could also be "irregular") is not always a power of 2... this could be the problem.
I need to find the best way to get the next power of 2 in Gideros, given my tilemap width*2 as a starting number.
Is there some math-savvy trick that you can think of?
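One simple way, sketched in plain Lua (an integer loop avoids the floating point error you can get from math.log):

local function nextPowerOfTwo(n)
	local p = 1
	while p < n do
		p = p * 2
	end
	return p
end

print(nextPowerOfTwo(964))     -- 1024
print(nextPowerOfTwo(4096))    -- 4096 (already a power of two)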
Thank you
Likes: antix, pie
But when it is for non-moving objects, I draw them together in one render target, and also draw the normal map on that single rendertarget. There is a weakness though: a render target can't exceed 2048 x 2048, so I need to keep the group mesh within 1024 x 1024 (considering the render target to be 2048 x 1024). So if it exceeds that, I need to create another group.
Also, sometimes when working with a rendertarget / mesh, you need to prepare them when the application starts, not just before you use them. If you use them directly after creating them, sometimes a bug can show up, for example the image is gone, or it becomes black.
So in my case, I create a bunch of them at startup and use them like a pool.
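Something like this minimal pooling sketch (the count and sizes are made up; the point is only that the RenderTargets exist before they are needed):

local rtPool = {}

local function initPool(count, w, h)
	for i = 1, count do
		rtPool[i] = {target = RenderTarget.new(w, h, true), free = true}
	end
end

local function acquireRenderTarget()
	for _, entry in ipairs(rtPool) do
		if entry.free then
			entry.free = false
			entry.target:clear(0x000000, 0)   -- reset to fully transparent before reuse
			return entry.target
		end
	end
	return nil                                -- pool exhausted; caller has to handle it
end

initPool(4, 2048, 1024)                       -- created once, at application start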
I didn't notice the renderTarget limit using 128x64 tiles; my texture is 4096 x 964 right now. It fails with tiles@2x on a phone (S2) but it's still working on tablet and laptop. However it's just a simple tilemap, I don't know what will happen when I bring it into my game with animations and stuff.
Now that my system works with coefficient 0.5 and any shape of tilemap, I tried to hardcode some more lights into the normal map shader example to see how many can be handled, but I am getting a bunch of errors:
Here's my ps shader code, I left vs as it is in the example:
I still don't get how to initialize and gather the totalLighting; I suppose I need to sum the result of each light and set gl_FragColor.
Here's my lua constructor:
Thanks
Likes: pie, oleg
Maybe the easiest and fastest way of doing this is to draw everything normally, then have a low resolution (but with filtering turned on) rendertarget that is a fraction of the screen display size. First clear the render target to black, then punch out of it the light positions in white, depending on where the light sources in your game will appear on screen (as a fraction). Then draw the render target on the screen scaled up to the full size of the screen (don't forget it was created as a fraction of the screen), but use the multiply option.
The multiply option will multiply areas that are black in the rendertarget by 0, but let light through on the bits that are not black. You can add hundreds of light sources on screen at once (I use 16 in Dungeons). The filtering and the fractional screen size make the multiply vary, so it's nicely shaded.
If you want thousands of light sources, then you can do a variation by using two more rendertargets; make them a fraction of 4 screens (or less) square. Build them up alternately over a number of frames, depending on where the scroll position of your map is. Now draw that on your original rendertarget first rather than clearing it to black. Swap the new rendertargets as they have been built.
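A minimal sketch of the basic version (the divisor, the white blob texture and the light list are assumptions, added only to make it self-contained):

local DIV = 4                                        -- overlay is 1/4 of the screen size
local sw, sh = application:getContentWidth(), application:getContentHeight()
local overlay = RenderTarget.new(sw / DIV, sh / DIV, true)   -- filtering on

local lightBitmap = Bitmap.new(Texture.new("light_blob_white.png", true))   -- soft white blob, assumed asset
lightBitmap:setAnchorPoint(0.5, 0.5)

local function redrawLights(lights)                  -- lights = { {x=..., y=...}, ... } in screen coordinates
	overlay:clear(0x000000, 1)                       -- start fully dark
	for _, l in ipairs(lights) do
		lightBitmap:setPosition(l.x / DIV, l.y / DIV)
		overlay:draw(lightBitmap)
	end
end

local overlaySprite = Bitmap.new(overlay)
overlaySprite:setScale(DIV)                          -- stretch back up to full screen size
overlaySprite:setBlendMode(Sprite.MULTIPLY)          -- black darkens, white lets the scene through
stage:addChild(overlaySprite)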
Do you have the DirectX (HLSL?) variation of your code, so we can see what changes would have to be made for compatibility with everything?
And no, I don't have the HLSL counterpart, though it shouldn't be hard to translate.
@hgy29 thanks a lot for the new shader, I am going to do some tests,
However at first glance it seems much more "unstable" or "heavy" than the single light source version (sometimes the map flickers or disappears for some ms) [edit: rebooting Windows made it work flawlessly], and when the sources overlap there is too much light. It is still a bit obscure to me how to change the light vector direction: right now it comes from below and points up-left (from a screen point of view). Changing the 150 changes the light "altitude", but which parameters control its "tilt" and "rotation"? (my maths ignorance comes out again...)
However, you did a great job with shaders:
The biggest improvement I can think of would be the possibility to set a normal map for each loaded texture (with a fallback if none), and let Gideros do the thinking about normals whenever the effect is applied to a sprite containing textured objects with normal maps. If you think it's possible, I will open a formal request on GitHub.
Thank you again
Mine is modified a lot; I didn't use the dot function, but distance instead.
I also changed the blue channel to mean not z, but how strongly the object reflects light; that way I can differentiate between metal and stone.
Just wondering, which device does the flicker appear on?
I don't have any flicker (on a low spec PC) with 16 light sources, with much more calculation.
It should just lag if it is too heavy; I did a heavy calculation before in another shader function and it dropped the frame rate, it didn't flicker like this.
Did you create the mesh/rendertarget and use it right away (not preparing it at the first run)?
If yes, it's probably better to try creating them first at the first run.
Also, if you want "not too much light when lights overlap", you need to modify the code yourself (for example capping the value with an if).
Changing the direction of the light would be a more advanced change; you would need to do much more calculation for that, not just change a variable. The lights would need more parameters, not just how strong they are, but also something like light rotation and light reach distance.
But creating shadows would, I think, be a heavy calculation if done in the shader. It's probably better to separate the shadow calculation like SinisterSoft said.
The flickering is gone after rebooting my laptop; I suppose something went wrong playing with shaders on my system. Sorry for the false alarm.
I agree with you, I like shader customization a lot (though I am still not able to do much on my own).
What I was speaking about was the possibility to use shaders directly on a Sprite (as I would do with GTween) without using a rendertarget and a mesh every time the sprite changes. I realize that this probably means Gideros would have to manage the same things on its own... However if this means making the normal map unchangeable, of course it's not worth it.
Back to my experiments: I managed to set light sources on/off by sending a parameter in a Shader.CINT uniform, but I am unable to set parameters by sending in an array. I think I am missing something about data types.
This is the relevant code:
lua call
working.
lua call
no errors in output window but black screen
lua call
ERROR: 0:38: 'expression' : left of '[' is not of type array, matrix, or vector
ERROR: 0:41: 'expression' : left of '[' is not of type array, matrix, or vector
What is the correct syntax to do it?
Is vec2 wrong since my xy are "unrelated"?
Thank you
You're mixing data types and array sizes.
If you use "mediump vec2 xxx[n]", as in your second example, then your data type is FLOAT, with 2*n values, while you passed int values to the shader from Lua, and only two values (not 4).
If you use "mediump int xxx[n]", as in your third example, then your data type is INT, with n values, not a two-dimensional array.
Sorry I don't have much time to elaborate tonight, hope this sheds some light.
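For reference, one consistent pairing could look like the sketch below (written from memory, so double check against the Shader reference; the uniform name lightPos and the values are made up):

-- GLSL side:  uniform mediump vec2 lightPos[4];
-- Lua side: declare the uniform with a matching type and array size...
local uniforms = {
	{name = "lightPos", type = Shader.CFLOAT2, mult = 4, vertex = false},
	-- ...plus the other uniforms the shader already declares
}

-- ...and send all 4 * 2 float values in one flat table:
effect:setConstant("lightPos", Shader.CFLOAT2, 4,
	{100, 200,   300, 80,   512, 512,   40, 400})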
Likes: pie
I would like to send 3 parameters to the shader in realtime, in addition to coordinates, and I saw that I can use method 1 from above (creating a uniform variable for each parameter I want to set).
Is there a more elegant (and maybe better performing) way to do the same thing? If I could change everything in one call, I think it would be better than calling setConstant 3 times.
Thank you
.
Thanks for the snippet; however, if there is no difference from doing setConstant multiple times, it seems better to stick with the old method. At least it is easier to read.
Basically I need an example of how to have a lit 3D scene (with one light for now).
Optimally, I'd like to see a new Gideros example project which does the following:
having a 3D object (preferably a random polygonal terrain seen from above)
and a light source (above the terrain) which follows the cursor movement.
Of course it would be silly to request such a thing, and if you give me enough help I may try to do this on my own (which you could add to the Gideros examples if it works out well), but so far I have no knowledge about shaders at all, so I don't even know where to start.
Of course I can probably do the terrain generation, so I just need to know how to handle a light source.
Thanks a lot.
Can you tell me what those lines do?
Next I will try to cut out all the code I don't need, given that my mesh is already generated, so I only need to add normals. By the way, it's interesting that although the mesh face is already in 3D, you can set its normal to something other than orthogonal to the face; what's the rationale behind this?
All in all that's enough for me to work out the rest; if I manage to put it into my app Fragmenter so that it performs and looks well etc., then I will let you know. It will probably come later, as there are many other more basic things that have to be done with it.
Then those per-vertex normals are interpolated for each pixel of the face, which gives the impression that the surface is curved when lit.
I suppose a 3D guy would explain this better than me.
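To illustrate (a rough sketch; the generic-attribute index 3 is an assumption and must match whatever attribute slot the lighting shader declares for normals):

-- a single 3D quad whose per-vertex normals lean outwards instead of pointing
-- straight along +z, so interpolation will shade it as if it were gently curved
local mesh = Mesh.new(true)                      -- true = 3D mesh (x, y, z per vertex)
mesh:setVertexArray(0,0,0,  100,0,0,  100,100,0,  0,100,0)
mesh:setIndexArray(1, 2, 3, 1, 3, 4)

mesh:setGenericArray(3, Shader.DFLOAT, 3, 4, {   -- per-vertex normals, 3 floats each, 4 vertices
	-0.3, -0.3, 0.9,
	 0.3, -0.3, 0.9,
	 0.3,  0.3, 0.9,
	-0.3,  0.3, 0.9,
})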
Likes: pie