Game Development: 4. User Interface and HUD
In the design of my game engine,
I have split the game’s visualisation into two primary components: the 3D
renderer, and the HUD. Both inherit from my AbstractScreen class, allowing them
to handle input, update and draw. Since these two components are displayed
together in the game, both visualising elements of the game’s state, I have
wrapped them in a GameplayScreen class, also an AbstractScreen, which is added
to the ScreenManager’s screen stack when the game loads. The GameplayScreen also
holds and updates the GameState, which stores everything in the game world,
including the player, enemies, the current level and so on. The GameplayScreen passes
the GameState to the renderer and HUD, which draw aspects of the current state,
with the HUD overlaid on the renderer output.
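A simplified sketch of this wiring is below. It is only a sketch: Renderer3D and InputState are stand-in names for the real types, and the HasFocus/IsActive/IsVisible checks are explained just after it.

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class GameplayScreen : AbstractScreen
{
    private readonly GameState _gameState;
    private readonly Renderer3D _renderer;   // the 3D renderer screen
    private readonly GameHUD _hud;           // the 2D HUD screen

    public GameplayScreen(GameState gameState, Renderer3D renderer, GameHUD hud)
    {
        _gameState = gameState;
        _renderer = renderer;
        _hud = hud;
    }

    public override void HandleInput(InputState input)
    {
        if (_renderer.HasFocus) _renderer.HandleInput(input);
        if (_hud.HasFocus) _hud.HandleInput(input);
    }

    public override void Update(GameTime gameTime)
    {
        _gameState.Update(gameTime);
        if (_renderer.IsActive) _renderer.Update(gameTime);
        if (_hud.IsActive) _hud.Update(gameTime);
    }

    public override void Draw(SpriteBatch spriteBatch)
    {
        // The renderer draws the game world first; the HUD is drawn second,
        // so it ends up overlaid on the renderer output.
        if (_renderer.IsVisible) _renderer.Draw(spriteBatch);
        if (_hud.IsVisible) _hud.Draw(spriteBatch);
    }
}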
Using the HasFocus, IsActive and
IsVisible properties, the behaviour of each component can be controlled
individually. For example, if the player wants to hide the HUD, its IsVisible
and HasFocus properties are set to false to stop it drawing and accepting
input, while IsActive is left as true so that the HUD keeps updating and is in
the correct state when it is made visible again. Additionally, setting these
properties on the GameplayScreen, which contains the renderer and HUD components,
allows both to be controlled at the same time. For example, setting HasFocus
and IsActive to false when the pause screen is added to the ScreenManager stops
the updating and input handling of both elements at once, while both the game
world and the HUD continue to be drawn underneath the transparent pause screen.
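As a concrete sketch, the AbstractScreen base class boils down to those three flags plus the three methods (InputState again standing in for the engine's input wrapper), and pausing is then just a matter of flipping the GameplayScreen's flags:

public abstract class AbstractScreen
{
    public bool HasFocus { get; set; } = true;    // handles input when true
    public bool IsActive { get; set; } = true;    // updates its logic when true
    public bool IsVisible { get; set; } = true;   // draws itself when true

    public abstract void HandleInput(InputState input);
    public abstract void Update(GameTime gameTime);
    public abstract void Draw(SpriteBatch spriteBatch);
}

// When the pause screen is pushed onto the ScreenManager:
gameplayScreen.HasFocus = false;   // stop input handling
gameplayScreen.IsActive = false;   // stop updating
gameplayScreen.IsVisible = true;   // keep drawing under the transparent pause screen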
UI Elements
A game’s Heads-Up Display (HUD)
is primarily used to concisely display a portion of the data available in the
game’s state, such as the player’s health, equipped weapon and ammo, or a mini-map.
Additionally, it may facilitate some of the gameplay features through interactive
elements, like an inventory and equipment display for a role-playing game, or
unit management in a strategy game. While some games render HUD elements within
the 3D render state, or display 3D elements within the HUD, I am currently
working under the assumption that all HUD elements are 2D and displayed over
the 3D render state.
The basic structure of a UI
element is very similar to that of a screen: it can handle input, update its
logic, and draw its visual components, the use of each controlled by the
HasFocus, IsActive and IsVisible properties, respectively. I’ve added an
additional Vector2 parentPosition input parameter to the UI Element’s Update
method, the use of which I discuss in the Coordinate System section below.
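Sketched out, the element base class looks something like this. The Width, Height and SetRelativePosition members are assumptions I lean on in the container examples later, and the position fields are explained in the Coordinate System section.

public abstract class AbstractUIElement
{
    public bool HasFocus { get; set; } = true;
    public bool IsActive { get; set; } = true;
    public bool IsVisible { get; set; } = true;

    public float Width { get; protected set; }
    public float Height { get; protected set; }

    protected Vector2 _positionRelative;   // offset from the parent's position
    protected Vector2 _positionAbsolute;   // actual screen position

    public void SetRelativePosition(Vector2 relative) => _positionRelative = relative;

    public virtual void HandleInput(InputState input) { }

    // parentPosition is the parent's current absolute position; what the
    // element does with it is covered in the Coordinate System section.
    public virtual void Update(GameTime gameTime, Vector2 parentPosition) { }

    public virtual void Draw(SpriteBatch spriteBatch) { }
}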
The main GameHUD screen stores a
linked list of type AbstractUIElement, into which any UIElement implementation
may be added. The GameHUD screen loops through all elements in the list in its
HandleInput, Update and Draw methods, checking the corresponding property (i.e.
HasFocus, IsActive, IsVisible), and calling the method on the element if the
property is true. The advantage of using a linked list for UI elements as
opposed to a regular list is that it is relatively trivial to reorder elements.
This is especially important during the Draw call, in which elements are drawn
from back to front in the order they appear in the linked list. If a user selects
part of the HUD that is obscured by other UI elements, we can pull it from the linked
list and re-add it at the end, so that it will be drawn last, and therefore on
top of the other elements. Adding an OnClick event to the UIElement and tying
it to a MoveToFront event handler in the GameHUD screen allows us to call this
ordering functionality when the player clicks on a UI element:
public void MoveToFront(UIElement element)
{
    if (_UIElements.Contains(element))
    {
        _UIElements.Remove(element);
        _UIElements.AddLast(element);
    }
}
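For context, the surrounding loops in the GameHUD look roughly like this. HandleInput follows the same pattern using HasFocus, and _positionAbsolute is the HUD's own screen position, which comes up again in the Coordinate System section.

private readonly LinkedList<AbstractUIElement> _UIElements = new LinkedList<AbstractUIElement>();

public override void Update(GameTime gameTime)
{
    foreach (AbstractUIElement element in _UIElements)
    {
        if (element.IsActive)
            element.Update(gameTime, _positionAbsolute);
    }
}

public override void Draw(SpriteBatch spriteBatch)
{
    // Elements are drawn in list order, so later entries appear on top.
    foreach (AbstractUIElement element in _UIElements)
    {
        if (element.IsVisible)
            element.Draw(spriteBatch);
    }
}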
The specific implementation of
each UI element will differ based on the information it is displaying or the
functionality it is providing. Generally, each element will display text and
images, so they will hold SpriteFonts and Texture2Ds. As most HUD elements are
communicating information or providing interaction with the game’s state, a
reference to the appropriate object in the game state is passed in the UI
element’s constructor. For example, a health bar HUD element will be passed the
player object, so that when it draws it uses the player object’s health
variable.
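As a sketch of that pattern, a health bar element might look something like this; the Player type and its Health and MaxHealth members are placeholders for whatever the game state actually exposes.

public class HealthBarElement : AbstractUIElement
{
    private readonly Player _player;
    private readonly Texture2D _barTexture;

    public HealthBarElement(Player player, Texture2D barTexture, Vector2 positionRelative)
    {
        _player = player;
        _barTexture = barTexture;
        _positionRelative = positionRelative;
        Width = 100;   // full-health bar width in pixels
        Height = 10;
    }

    public override void Draw(SpriteBatch spriteBatch)
    {
        // Scale the bar by the player's current health fraction and draw it
        // at the element's absolute screen position.
        float healthFraction = (float)_player.Health / _player.MaxHealth;
        spriteBatch.Draw(_barTexture,
            new Rectangle((int)_positionAbsolute.X, (int)_positionAbsolute.Y,
                (int)(Width * healthFraction), (int)Height),
            Color.Red);
    }
}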
Element Hierarchy
Rather than storing all UI
elements in a flat structure directly in the HUD class, I’ve constructed a UI
hierarchy that allows elements to be grouped in containers. Basic
container-type UI components include grid and list views, which can be combined
with other UI elements to form more complex HUD components, such as a
grid-based player inventory or skill bar. Each container specifies the layout
and position of sub-components, managing them with a nested coordinate system.
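A bare-bones container sketch, just to show the shape of the idea: the container keeps its own list of children and hands its absolute position down as their parent position every frame. The position calculation itself is covered in the next section.

public class UIContainer : AbstractUIElement
{
    protected readonly List<AbstractUIElement> _children = new List<AbstractUIElement>();

    public void Add(AbstractUIElement child) => _children.Add(child);

    public override void Update(GameTime gameTime, Vector2 parentPosition)
    {
        // Work out this container's own screen position first...
        _positionAbsolute = parentPosition + _positionRelative;

        // ...then pass it down so children are positioned relative to the
        // container rather than the screen.
        foreach (AbstractUIElement child in _children)
        {
            if (child.IsActive)
                child.Update(gameTime, _positionAbsolute);
        }
    }

    public override void Draw(SpriteBatch spriteBatch)
    {
        foreach (AbstractUIElement child in _children)
        {
            if (child.IsVisible)
                child.Draw(spriteBatch);
        }
    }
}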
Coordinate System
With a 2D HUD, we only need to
concern ourselves with X and Y coordinates, with the position values of all HUD
elements stored as Vector2s. Our main HUD screen has the full game window as
its canvas. In MonoGame, the (0, 0) start coordinate of the window is the
top-left corner of the window, while the final (Viewport.Width,
Viewport.Height) coordinate is its bottom-right corner.
When using a hierarchical approach to storing UI elements, it is important to be able to position a component
relative to its parent. I store two positional values for every element:
_positionRelative, an element’s XY offset from its parent’s current position,
and _positionAbsolute, the actual screen position of the element. Following the
coordinate system of the MonoGame application’s viewport, _positionAbsolute is
the top-left corner of a UI element.
When a UI element is created, its
relative position is set and it is passed its parent’s current absolute
position, calculating its own position as follows:
_positionAbsolute = parentPosition + _positionRelative
In the Update call for every UI
element, I pass in the latest parentPosition and perform the above calculation,
ensuring that each element is always correctly positioned relative to its
parent. When drawing the element, all calls to the SpriteBatch will use the
screen position stored in _positionAbsolute.
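Concretely, the base element's Update is little more than that one line:

public virtual void Update(GameTime gameTime, Vector2 parentPosition)
{
    // The element's screen position is its parent's screen position plus
    // its own relative offset.
    _positionAbsolute = parentPosition + _positionRelative;
}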
UI elements also have Width and
Height properties, calculated from the component’s text, Texture2D or bounding box. Container UI
elements use these properties to position child components relative to each
other, for example setting the _positionRelative of the second element in a
grid row equal to the first element’s offset plus its Width.
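A sketch of that row layout, building on the container sketch above; SetRelativePosition is the assumed helper from the element sketch earlier.

public class RowContainer : UIContainer
{
    // Lay children out left to right by accumulating their widths.
    public void LayoutChildren()
    {
        float offsetX = 0f;
        foreach (AbstractUIElement child in _children)
        {
            child.SetRelativePosition(new Vector2(offsetX, 0f));
            offsetX += child.Width;
        }
    }
}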
Experiment: Floating HUD
I’d been playing Destiny as I was
implementing the HUD, and liked the way the UI elements float around the screen
as the cursor moves: as the cursor moves right, the UI floats left; as it moves down, the UI moves up; and vice versa. As
a test of my HUD, I decided I’d try implementing the functionality in my code.
Thanks to my hierarchical UI and
relative-position based coordinate system, it turns out that this is a
relatively simple feature to get working. In order to move the whole UI, all we
need to do is change the _positionAbsolute value of the GameHUD screen. All UI
elements that it holds will have their positions modified during the update call, and the change propagates down through the element hierarchy until every child UI element’s position has been updated.
To implement this functionality, I modified the code to calculate the change to apply to the HUD screen’s _positionAbsolute based on the position of the mouse cursor during every update.
Firstly, I add an additional Vector2 _defaultPosition coordinate to the GameHUD
class, which stores the position of the HUD when the mouse is at its “default”
position (e.g. the top-left corner, the centre of the screen, or the bottom-right corner). In any
frame, the absolute position of the HUD is its default position plus some
factor of the mouse cursor’s deviation from its default position:
Vector2 mouseMoveOffset = new Vector2(
    (-input.CurrentMouseState.X) / dampeningFactor,
    (-input.CurrentMouseState.Y) / dampeningFactor);
_positionAbsolute = _defaultPosition + mouseMoveOffset;
The dampeningFactor controls the
ratio of mouse movement to HUD movement, while negating the mouse coordinates moves the HUD in the opposite direction to the mouse movement. This calculation is performed, and the resulting _positionAbsolute value is passed to every UIElement
during every update, allowing the _positionAbsolute of every element in the UI
hierarchy to be updated before the Draw call.
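For completeness, a sketch of how this might sit in the GameHUD, with the state it relies on. The dampeningFactor value here is just an illustrative guess, and I have shown the calculation in HandleInput since that is where the input state is available in this sketch.

private Vector2 _positionAbsolute;           // the HUD's current screen position
private Vector2 _defaultPosition;            // HUD position when the mouse is at its "default" spot
private const float dampeningFactor = 20f;   // illustrative: 20px of mouse travel per 1px of HUD travel

public override void HandleInput(InputState input)
{
    // Recalculate the HUD's screen position from the mouse; the Update loop
    // shown earlier then passes it down to every element in the hierarchy.
    // (Per-element input handling omitted here.)
    Vector2 mouseMoveOffset = new Vector2(
        (-input.CurrentMouseState.X) / dampeningFactor,
        (-input.CurrentMouseState.Y) / dampeningFactor);
    _positionAbsolute = _defaultPosition + mouseMoveOffset;
}

Because the offset is applied at the root of the hierarchy, no individual element needs to know that the floating behaviour exists; they simply reposition themselves relative to their parent as usual.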