So, the Vket preparations were a total disaster, and now I can focus on doing things nicely again. A few rushed bits are still present, though. In random order:

Texture2DArray support has been added, with a simple texture array. I still need to develop a nice 2D Array creator and preparation tool (a rough sketch of what I have in mind is included below). The reason is that I'm amazed by how BAD Blender is at simple things, like setting the color value of a vertex. "PUT COLOR RGB(1,0,0) ON SELECTED VERTEX" is a very difficult operation in Blender... The whole Vertex Paint mode is a joke, since it doesn't apply exactly the color you want, even if you specify the hexadecimal color code, which should already be "gamma corrected".

Still, the shader I prepared has some issues with the normals, and I don't know why, since I have no idea about the normals "format": how the data is sent to the shader, how it's handled, and so on. To understand that, I'd have to write my own shader from scratch... In the near future!

That said, you can now select a texture to apply to the next tiles. That's not really the end goal, though. The idea is to have texture "slots" (1, 2, 3, 4, 5), paint with these slots (think Minesweeper, where you paint numbers on the logic part), and bind each slot to a specific index of the Texture2DArray. Another menu would let you change the texture assigned to a slot, so you can try different textures quickly. That's for the next version, though (a small sketch of the slot binding follows below). The step after that is "triplanar" mapping, so the textures extend smoothly between tiles and give a nice look.

Meanwhile, reading texture data from Unity, inside an Editor tool, is rather easy. So I'll try to save the data inside a screenshot, using a dumb way, and then try to make the screenshot match EXACTLY the standard camera output (I'll deal with different resolutions afterwards; see the capture sketch below).

Also, I dropped the "bazillion layers" idea and went back to a single layer for raycasting, while checking the name of the collider that was hit. I might drop layers entirely afterwards, since I just learned that a raycast can be fired against one specific collider, which guarantees the raycast only tests that collider! The logic will then be based on the local hit position. I'm also computing InverseTransformPoint from the transform that was hit, which removes the need to pass every potential target transform as a parameter to the script (which was stupid). This means I also dropped the idea of sampling texture coordinates, but since InverseTransformPoint divides out the transform's scale, the local hit position is almost identical to texture coordinates when hitting (scaled) 1x1 quads (see the last sketch below).

Some files were moved due to VketTools constraints, and I might follow this pattern in the future. Having all the required files in the same folder actually makes backups easier.
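Since the 2D Array creator mentioned above doesn't exist yet, here is a minimal sketch of the kind of Editor tool I have in mind, assuming every source texture has the same size and is marked Read/Write enabled. The class name, menu path and asset path are placeholders, not actual project files:

```csharp
// Editor/Texture2DArrayCreator.cs
// Minimal sketch: packs same-sized Texture2D assets into one Texture2DArray asset.
// Assumes every source texture has the same width/height and is Read/Write enabled.
// All names and paths here are placeholders.
using UnityEngine;
using UnityEditor;

public class Texture2DArrayCreator : ScriptableWizard
{
    public Texture2D[] sourceTextures;                      // One texture per slice
    public string assetPath = "Assets/TileTextures.asset";  // Hypothetical output path

    [MenuItem("Tools/Create Texture2DArray")]
    static void CreateWizard()
    {
        DisplayWizard<Texture2DArrayCreator>("Create Texture2DArray", "Create");
    }

    void OnWizardCreate()
    {
        if (sourceTextures == null || sourceTextures.Length == 0) return;

        Texture2D first = sourceTextures[0];
        // Uncompressed RGBA32 so SetPixels works regardless of the source compression.
        var array = new Texture2DArray(first.width, first.height,
                                       sourceTextures.Length, TextureFormat.RGBA32, true);

        // Copy each source texture into its slice (CPU-side, hence the readable requirement).
        for (int slice = 0; slice < sourceTextures.Length; slice++)
        {
            array.SetPixels(sourceTextures[slice].GetPixels(), slice);
        }
        array.Apply();

        AssetDatabase.CreateAsset(array, assetPath);
        AssetDatabase.SaveAssets();
    }
}
```

The CPU-side copy is the "dumb but serializable" route; the real tool will probably also need to handle mismatched sizes and linear vs sRGB sources.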
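The slot system isn't implemented yet, but roughly, the binding would just be a tiny mapping from the painted slot number to a Texture2DArray slice, something like this hypothetical asset (names are made up, not existing code):

```csharp
// Sketch of the planned slot -> Texture2DArray slice binding (next version).
// Names are placeholders; the real tool may look completely different.
using UnityEngine;

[CreateAssetMenu(menuName = "Tiles/Texture Slot Set")]
public class TextureSlotSet : ScriptableObject
{
    [Tooltip("Element i is the Texture2DArray slice used by slot i+1 (slots 1..5).")]
    public int[] slotToArrayIndex = new int[5] { 0, 1, 2, 3, 4 };

    public int SliceForSlot(int slot)
    {
        // Slots are 1-based in the UI; clamp to a valid slice to stay safe.
        int i = Mathf.Clamp(slot - 1, 0, slotToArrayIndex.Length - 1);
        return slotToArrayIndex[i];
    }
}
```

Swapping the texture set for a slot then just means editing this asset, instead of repainting the tiles.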
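For the screenshot part, here is a minimal sketch of the "dumb way" I'm planning, assuming the goal is simply a PNG whose pixel size matches the camera's current output exactly. The class name and save path are placeholders:

```csharp
// Sketch: capture a camera's output into a Texture2D at the camera's exact pixel size,
// then write it to disk as a PNG. Class name and path are placeholders.
using System.IO;
using UnityEngine;

public static class CameraCapture
{
    public static void SaveCameraView(Camera cam, string path)
    {
        int width  = cam.pixelWidth;
        int height = cam.pixelHeight;

        var rt = new RenderTexture(width, height, 24);
        var previousTarget = cam.targetTexture;
        var previousActive = RenderTexture.active;

        cam.targetTexture = rt;
        cam.Render();                       // Render one frame into the RenderTexture

        RenderTexture.active = rt;
        var tex = new Texture2D(width, height, TextureFormat.RGBA32, false);
        tex.ReadPixels(new Rect(0, 0, width, height), 0, 0);
        tex.Apply();

        // Restore state and release temporaries.
        cam.targetTexture = previousTarget;
        RenderTexture.active = previousActive;
        rt.Release();

        File.WriteAllBytes(path, tex.EncodeToPNG());
        Object.DestroyImmediate(tex);
    }
}
```

Matching pixelWidth/pixelHeight is what should make the saved image line up with the standard camera output; other resolutions come later.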
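And the new raycast logic, roughly: fire the ray against one specific collider only and convert the hit point to that object's local space, which is what makes the extra layers unnecessary. A hedged sketch (field names are placeholders for whatever the real script ends up using):

```csharp
// Sketch of the new hit logic: raycast against one specific collider only,
// then convert the world hit point to that object's local space.
// Field names are placeholders.
using UnityEngine;

public class TileHitProbe : MonoBehaviour
{
    public Collider targetCollider;   // The only collider we want to test
    public float maxDistance = 100f;

    void Update()
    {
        if (!Input.GetMouseButtonDown(0) || targetCollider == null) return;

        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);

        // Collider.Raycast only tests this collider, so no layer setup is needed.
        if (targetCollider.Raycast(ray, out RaycastHit hit, maxDistance))
        {
            // Local position of the hit; on a 1x1 quad this stays roughly in -0.5..0.5
            // on X/Y regardless of the quad's scale, so it maps directly to UV-like coords.
            Vector3 localHit = hit.transform.InverseTransformPoint(hit.point);
            Vector2 uvLike = new Vector2(localHit.x + 0.5f, localHit.y + 0.5f);
            Debug.Log($"Local hit: {localHit}  UV-ish: {uvLike}");
        }
    }
}
```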
Signed-off-by: Voyage <[email protected]>