Channel: Community | MonoGame - Latest topics

What's happening here?

@Katorias wrote:

Hey guys,

I've been playing around with different camera projections and I've come across some behavior that I can't seem to wrap my head around.

I'm rendering some vertices like so,

```
// Two triangles forming a 40x40 quad in the XY plane (Z = 0).
floorVerts[0].Position = new Vector3(-20, -20, 0);
floorVerts[1].Position = new Vector3(-20, 20, 0);
floorVerts[2].Position = new Vector3(20, -20, 0);

floorVerts[3].Position = floorVerts[1].Position;
floorVerts[4].Position = new Vector3(20, 20, 0);
floorVerts[5].Position = floorVerts[2].Position;
```

With the following position and look-at vectors:

```
public Vector3 position = new Vector3(0, 0, 30);
public Vector3 lookAtVector = new Vector3(0, 1, 0);
```

With an up vector of (0, 0, 1) this renders completely fine. However, if I set the Y value of the look-at vector to anything below zero, I don't see the results I would expect; instead the camera flips 180 degrees around the Z axis, almost as if it has been spun around.

I understand why I wouldn't see any vertices rendered if the target vector were (0, 0, 0); I'm just confused as to why the camera doesn't simply keep looking "further back" along the Y axis, but instead seems to suddenly flip. I'm sure I'm missing something completely obvious here :smiley:
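The flip described above falls out of how a right-handed look-at matrix (as built by XNA/MonoGame's `Matrix.CreateLookAt`) derives the camera's "right" axis from a cross product with the up vector. The sketch below reproduces that basis construction in plain Python with the poster's numbers, as an assumption-labeled illustration rather than MonoGame's actual source: when the target's Y sign changes, the horizontal component of the view direction reverses, so the cross product (and with it the whole camera basis) flips sign about the Z axis.

```python
# Hedged sketch of the right-handed look-at basis construction
# (same shape as XNA/MonoGame Matrix.CreateLookAt):
#   z_axis = normalize(position - target)   # camera "backward"
#   x_axis = normalize(cross(up, z_axis))   # camera "right"
#   y_axis = cross(z_axis, x_axis)          # camera "up"

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    m = sum(x * x for x in v) ** 0.5
    return tuple(x / m for x in v)

def lookat_basis(position, target, up):
    z_axis = normalize(sub(position, target))
    x_axis = normalize(cross(up, z_axis))
    y_axis = cross(z_axis, x_axis)
    return x_axis, y_axis, z_axis

position = (0, 0, 30)   # camera position from the post
up = (0, 0, 1)          # up vector from the post

# Target with Y = +1 vs Y = -1: only the Y sign differs.
right_pos_y, _, _ = lookat_basis(position, (0, 1, 0), up)
right_neg_y, _, _ = lookat_basis(position, (0, -1, 0), up)

print(right_pos_y)  # (1.0, 0.0, 0.0)
print(right_neg_y)  # (-1.0, 0.0, 0.0)
```

The camera's right axis goes from +X to -X the instant the target's Y component crosses zero, which is exactly the sudden 180-degree spin about Z that the post describes; there is no continuous "looking further back", because the basis is rebuilt from scratch each frame from the forward/up cross product.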

Posts: 3

Participants: 3

