francot514
Well-Known Member
Rhys said: It includes fixes for the primitives too, yes.
Ok, that sounds good. I have also tested some EA Land objects; they are partially working, except those that still need NPCs.
Rhys said: It includes fixes for the primitives too, yes.
The game assumes that you are mousing over a ground tile when placing objects, similar to how the real game works, but different in that table-placed objects actually offset this down a little bit.
Rhys said: Are you talking about using the hand tool for picking objects up? That's just performed by rendering the objects with an Object ID buffer instead of colour/depth, and getting the ID at the pixel clicked. It's hilariously inefficient right now though (it draws all objects every tick in update... ugh), and will be worked a bit more into the static buffers system once I clean that all up, so that it's simply rendered at the same time as the colour/depth buffers using multiple render targets for minimum cost.

At first I thought you meant that when you move the Hand Tool over a tile on the ground, if a surface object exists on that tile, then the game will shift the sprite of the object you're placing away from your true mouse coordinates, up onto the surface. If the user is moving an object from one surface onto another of equal height, he can't tell that his mouse coordinates have been shifted up.
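The ID-buffer approach described above can be sketched roughly like this. This is a minimal illustration, not the actual engine code: each object rasterises its integer ID into a per-pixel buffer alongside a depth value, so nearer objects win exactly as in normal depth testing, and picking is just reading the ID at the clicked pixel. All names and the pixel representation are assumptions for the sketch.

```python
# Hypothetical sketch of Object-ID-buffer picking (illustrative names, not the real API).
WIDTH, HEIGHT = 8, 6

def make_buffers():
    id_buf = [[0] * WIDTH for _ in range(HEIGHT)]               # 0 = nothing here
    depth_buf = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
    return id_buf, depth_buf

def draw_object(id_buf, depth_buf, obj_id, pixels):
    """pixels: iterable of (x, y, depth) that the object's sprite covers."""
    for x, y, depth in pixels:
        if depth < depth_buf[y][x]:   # standard depth test: nearer fragment wins
            depth_buf[y][x] = depth
            id_buf[y][x] = obj_id

def pick(id_buf, x, y):
    """Hand tool click: the ID stored at the pixel is the object under the cursor."""
    return id_buf[y][x]

id_buf, depth_buf = make_buffers()
draw_object(id_buf, depth_buf, obj_id=7, pixels=[(2, 2, 5.0), (3, 2, 5.0)])
draw_object(id_buf, depth_buf, obj_id=9, pixels=[(3, 2, 3.0)])  # nearer, occludes 7
print(pick(id_buf, 3, 2))  # → 9, the nearer object wins
```

Rendering this ID buffer as an extra render target alongside colour/depth, as Rhys suggests, would make the per-click cost essentially free compared to redrawing every object per tick.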
However, I checked this behavior in-game, and that wasn't the case. So when you said "table placed objects actually offset this down a little bit", you meant that they offset the range of screen coordinates for which the object will be placed on that counter upwards, away from the ground. (And of course, the sprite of the object will be shifted upwards onto the counter.)
So when the user switches to the Hand Tool, you could create the mapping "screen coordinate (x, y) corresponds to this object (or the ground) at tile (u, v) with z-order distance z" for all (x, y), and then iterate over each of these mappings from farthest z-order distance to nearest. If the object at coordinate (x, y) is a surface object, you create the new mapping "screen coordinate (x, y+height) maps to the surface of the surface object on tile (u, v) with z-order distance z", overwriting the old mapping.
However, since TSO lets you place new objects without pausing the game time, you would have to update the mappings after each SimAntics tick.
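The mapping idea described in the two paragraphs above could be sketched as follows. This is a hypothetical illustration under assumed conventions (here "+height" moves the hit region toward the surface top; entry names like `"surface-top"` are made up for the sketch), and since placement happens while the game runs, the map would be rebuilt after each SimAntics tick.

```python
# Hypothetical sketch of the surface-offset pick map (illustrative, not engine code).
def build_pick_map(hits):
    """hits: (x, y, z, tile, kind, height) entries, one per screen pixel a drawable
    covers. kind is 'ground', 'object' or 'surface'. Returns screen -> target map."""
    pick_map = {}
    # Process farthest-first so nearer entries overwrite farther ones, painter-style.
    for x, y, z, tile, kind, height in sorted(hits, key=lambda h: -h[2]):
        pick_map[(x, y)] = (tile, kind, z)
        if kind == "surface":
            # A counter/table shifts its drop-target region up by its height; the
            # shifted entry targets the surface's top rather than the ground.
            pick_map[(x, y + height)] = (tile, "surface-top", z)
    return pick_map

hits = [
    (5, 0, 10.0, (2, 3), "ground", 0),
    (5, 0, 4.0, (2, 3), "surface", 2),   # a counter on the same tile, nearer
]
pm = build_pick_map(hits)
print(pm[(5, 2)])  # the shifted entry: the drop target is the counter's surface
```

Rebuilding this whole map per tick is of course the brute-force version; in practice you would only re-rasterise tiles whose contents changed that tick.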
Are you talking about using the hand tool for picking objects up?
Rhys said: I had another look at the behaviour in the real game, and the anchor position seems to change depending on where you picked it up from. I don't think there's anything fancy going on, just a hit test with the ground plus an offset. Try picking up a big object like a fridge from multiple spots (top and bottom), and you'll see what I mean. I'll try to implement this for the next release.

I was referring to placing (old or new) objects down on a new "tile" (or a new surface object).
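The "hit test with an offset" behaviour Rhys describes can be sketched minimally: remember where on the object you grabbed it, and keep that offset between the mouse and the object's anchor while dragging. All names here are illustrative assumptions, not the engine's API.

```python
# Hypothetical sketch of grab-offset dragging (illustrative names only).
def start_drag(object_pos, mouse_pos):
    """On pickup, record the grab offset: mouse relative to the object's anchor."""
    ox = mouse_pos[0] - object_pos[0]
    oy = mouse_pos[1] - object_pos[1]
    return (ox, oy)

def drag_position(mouse_pos, grab_offset):
    """While dragging, the anchor follows the mouse minus the grab offset, so a
    fridge picked up by its top keeps the cursor at its top throughout the drag."""
    return (mouse_pos[0] - grab_offset[0], mouse_pos[1] - grab_offset[1])

offset = start_drag(object_pos=(10, 20), mouse_pos=(12, 25))  # grabbed near the top
print(drag_position((40, 40), offset))  # → (38, 35)
```

This matches the observed behaviour that the anchor depends on the pickup spot, with no extra machinery beyond the ground hit test itself.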
Of course, you pick an object up just by clicking on it; to know what the user is clicking on, you just refer to the screen z-buffer to see which object is nearest to the camera at that point.
Rhys said: Everything pretty much working like a charm (even with fade tween!), with a few things left to tie together + object thumbnails on the left.
One definite future problem will be rendering object thumbnails, which is needed both here and for object dialogs.