All posts by Chris McBain

Initial Search Functionality


The tileset shown previously turned out to have a few flaws and has been reworked slightly; more importantly, initial functionality has now been added towards the required search algorithm.

Red gizmos represent nodes within the search radius that have been flagged to be searched; green gizmos represent those that can currently be ‘seen’ by the object. At the moment this covers only the primary vision arc, and does not factor in obstacles (though that functionality has been implemented elsewhere).

Using the arc meshes for such detection proved troublesome, as the project is actually in 3D rather than 2D. The work-around was:

// Get the normalised forward vector of the 'entity' object
Vector3 sight = forward.normalized;

// Get the normalised direction vector from the entity to this search node
Vector3 distanceVector = mSearchNodes[ i ].transform.position - center;
Vector3 direction = distanceVector.normalized;

// Dot product returns 1 (same direction) down to 0 (90 degrees apart)
float cosine = Vector3.Dot( sight, direction );

// Use arc cos and convert from radians to degrees
float degrees = Mathf.Acos( cosine ) * Mathf.Rad2Deg;

The intended arcs were about 40 degrees (primary) and 140 degrees (peripheral): a node falls within an arc if the returned angle is less than half of that arc's width.
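The same check can be sketched as a self-contained function (plain C++ with an illustrative vector type, not the project's Unity code):

```cpp
#include <cmath>

// Illustrative stand-in for the engine's vector type.
struct Vec3 { float x, y, z; };

static Vec3 normalise(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

static float dot(Vec3 a, Vec3 b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Angle in degrees between the entity's forward vector and the
// direction from the entity to a search node.
float AngleToNodeDegrees(Vec3 forward, Vec3 entityPos, Vec3 nodePos) {
    Vec3 sight = normalise(forward);
    Vec3 direction = normalise({ nodePos.x - entityPos.x,
                                 nodePos.y - entityPos.y,
                                 nodePos.z - entityPos.z });
    const float kRadToDeg = 180.0f / 3.14159265f;
    return std::acos(dot(sight, direction)) * kRadToDeg;
}

// A node is inside an arc if the angle is less than half the arc's
// width, e.g. 20 degrees for the 40-degree primary arc.
bool InsideArc(float degrees, float arcWidthDegrees) {
    return degrees < arcWidthDegrees * 0.5f;
}
```

A node straight ahead returns roughly 0 degrees and passes the primary check; one directly to the side returns roughly 90 degrees and fails even the peripheral check.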

An early version of the area system has also been implemented – checking whether a search area is within the radius before doing the calculation above for each search node within it. Currently this relies on collider checks (which also return obstacles and walls), but it does cut down on unnecessary per-node checks.
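As an illustration of that broad-phase idea, here is a hedged sketch of a radius pre-check that uses squared distances instead of collider queries (the `Area` type and names are hypothetical, not from the project):

```cpp
#include <vector>

// Hypothetical centre of a search area on the ground plane.
struct Area { float x, z; };

// Return the indices of areas whose centres lie within the search
// radius; squared distances avoid any square-root calls.
std::vector<int> AreasInRadius(const std::vector<Area>& areas,
                               float cx, float cz, float radius) {
    std::vector<int> hits;
    float r2 = radius * radius;
    for (int i = 0; i < (int)areas.size(); ++i) {
        float dx = areas[i].x - cx;
        float dz = areas[i].z - cz;
        if (dx * dx + dz * dz <= r2)  // inside the search radius
            hits.push_back(i);
    }
    return hits;
}
```

Only the surviving areas would then have their individual nodes run through the vision-arc calculation.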

Run Doggies Run!


This was our submission for the 2015 Global Game Jam at Abertay University!

This game was made by Julie Kennedy and myself in under 48 hours. The theme this year was “What Do We Do Now”, which resulted in a fair few co-op games like ours. The object of the game is to keep running as fast as possible by jumping over flowers, and removing larger obstacles by getting the right dog in front.

Prototyping Search Nodes for Area Searching


This version of the test environment has been built using a set of tile prefabs – each containing its own search nodes and walls. At this time some manual adjustment of wall placement is still required (deleting many of them, or moving others on a 0.25f grid). Level construction should now be much faster, and even allows some automation!


The white spheres represent the grid search node structure (1.0f spacing). The walls have been adjusted to match this level's layout.


I was able to utilize the NavMesh (once baked) to automatically detect any SearchNodes that cannot be reached, and flag them as blocked (black).
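The project does this via the baked NavMesh; purely as an illustration of the same reachability idea, a grid flood-fill from a known-reachable seed can mark everything it cannot reach as blocked (this is a substitute technique sketched in C++, not the project's code):

```cpp
#include <queue>
#include <utility>
#include <vector>

// Flood-fill a width x height grid from a known-reachable seed cell.
// 'walls' marks cells that cannot be entered; the result marks every
// cell that the fill could not reach as blocked (walls included).
std::vector<bool> FloodFillBlocked(const std::vector<bool>& walls,
                                   int width, int height,
                                   int seedX, int seedY) {
    std::vector<bool> reachable(width * height, false);
    std::queue<std::pair<int, int>> open;
    open.push({seedX, seedY});
    reachable[seedY * width + seedX] = true;
    const int dx[] = {1, -1, 0, 0};
    const int dy[] = {0, 0, 1, -1};
    while (!open.empty()) {
        auto [x, y] = open.front();
        open.pop();
        for (int d = 0; d < 4; ++d) {
            int nx = x + dx[d], ny = y + dy[d];
            if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
            int idx = ny * width + nx;
            if (walls[idx] || reachable[idx]) continue;
            reachable[idx] = true;
            open.push({nx, ny});
        }
    }
    std::vector<bool> blocked(width * height);
    for (int i = 0; i < width * height; ++i) blocked[i] = !reachable[i];
    return blocked;
}
```

Anything behind a wall never gets visited by the fill, so it ends up flagged blocked – the same outcome as the NavMesh check described above.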

The next step will be implementing the required search functionality.

Prototyping Tileset with NavMesh


Before commencing further work into the implementation of a suitable method of searching the environment, I decided to build a prototype tileset. The outer loop is built out of many individual tiles – but the NavMesh correctly joins them together (with a few small glitches).

If built in such a way, the tiles would be Unity Prefabs – most importantly containing the required node structure to allow for a grid search. Each ‘tile’ or prefab would contain its own container of nodes, therefore allowing a general sweep of each tile before accessing the nodes within.

Current construction does not allow me to fill in the room in the middle correctly – scaling the square tile would be possible, but this would also scale any node structure later included within. I will be looking either to design the tileset a little more carefully, or include a wider range of rectangular sections to fill the gaps.

Hunting Simulation

Source code – Python for Maya.

Following on from the first semester's Technical Art Applications, Scripting & Dynamics was left almost completely open in terms of our submission.

Having focused on the Volibear rig rather than the Python tool, I decided to pursue a scripting-heavy coursework this time.

My chosen area was crowd simulation – specifically, I wanted to look at the interaction between herding mechanics and hunting. Ideally I wanted to look at the interactions between two groups (prey and pack), rather than single predator agents interacting with a single group of prey.

These diagrams are recreated from those shown in Craig Reynolds' Boids work, and show the three basic flocking rules.
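For reference, the three rules can be sketched in a few small functions (an illustrative 2D C++ sketch, not the coursework's Maya/Python implementation):

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Minimal 2D boid: position plus velocity.
struct Boid { float x, y, vx, vy; };

// Cohesion: steer towards the average position of neighbours.
std::pair<float, float> Cohesion(const Boid& self,
                                 const std::vector<Boid>& neighbours) {
    if (neighbours.empty()) return {0.0f, 0.0f};
    float cx = 0, cy = 0;
    for (const Boid& b : neighbours) { cx += b.x; cy += b.y; }
    cx /= neighbours.size();
    cy /= neighbours.size();
    return {cx - self.x, cy - self.y};
}

// Alignment: steer towards the average heading of neighbours.
std::pair<float, float> Alignment(const Boid& self,
                                  const std::vector<Boid>& neighbours) {
    if (neighbours.empty()) return {0.0f, 0.0f};
    float ax = 0, ay = 0;
    for (const Boid& b : neighbours) { ax += b.vx; ay += b.vy; }
    ax /= neighbours.size();
    ay /= neighbours.size();
    return {ax - self.vx, ay - self.vy};
}

// Separation: steer away from neighbours that are too close.
std::pair<float, float> Separation(const Boid& self,
                                   const std::vector<Boid>& neighbours,
                                   float minDist) {
    float sx = 0, sy = 0;
    for (const Boid& b : neighbours) {
        float dx = self.x - b.x, dy = self.y - b.y;
        if (std::sqrt(dx * dx + dy * dy) < minDist) { sx += dx; sy += dy; }
    }
    return {sx, sy};
}
```

Each rule yields a steering vector; a simulation would weight and blend the three into each boid's velocity per frame.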


I made these diagrams to demonstrate the idea of a pack consisting of centres and wings, and a flanking-and-driving style of hunting.


It's worth noting that although my proposal was accepted, I was warned about this being an ambitious project.

This blog, kept as part of the submission, tells the story of the coursework.

Challenge of the Grid

This was a single semester project to have an initial look at Network Programming.

Source code – SFML 2.0, C++.

The game was created using the SFML 2.0 library – using both its graphics and networking modules.

Using the graphics library was an absolute pleasure; I often remarked it was like being handed our PS2 framework – only fully featured and ready to go.

The networking library also proved pretty decent – matching up directly with what we had been taught about sockets.

IndieFest 2013

Although many of us had attended Dare To Be Digital in previous years – this would be the first time exhibiting at IndieFest.


The chosen game was our latest GDS project Hubble Bubble – with two Windows builds at our booth.


This was a great experience overall – and highlighted the importance of the tutorial for Hubble Bubble.


Source code – XAudio2, C++.

Allan Milne’s framework has not been included, though it remains referenced as appropriate.

This coursework was unique in a couple of ways:

firstly, although graphical elements were permitted for debugging purposes, they were not allowed to form part of the final submission.

secondly, we were given the opportunity to use the lecturer's own XAudio2 framework.

Unsure of quite how to go about making an audio-only game, I eventually settled on the idea of using the motion tracker from Aliens.

The idea, or story, was that your motion tracker's display was broken – the player would have to rely on its audio output alone. During development I felt that leaving direction judgement solely to the stereo left and right channels was too difficult and frustrating. I therefore decided to exaggerate the player's (listener's) audio cone, allowing full volume only in the front 90 degrees, and reducing to zero volume before reaching the rear 90-degree arc.
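The cone described above can be sketched as a simple gain curve – the 90-degree figures come from the post, while the linear fade between them is an assumption:

```cpp
#include <cmath>

// Exaggerated listener cone: full volume inside the front 90-degree arc
// (within 45 degrees of forward), silent by the edge of the rear
// 90-degree arc (135 degrees off forward). The linear fade across the
// side arcs is an assumed shape, not taken from the project.
float ConeGain(float degreesOffForward) {
    const float fullVolumeHalfArc = 45.0f;   // front 90-degree arc
    const float silentHalfArc = 135.0f;      // rear 90-degree arc starts here
    if (degreesOffForward <= fullVolumeHalfArc) return 1.0f;
    if (degreesOffForward >= silentHalfArc) return 0.0f;
    return 1.0f - (degreesOffForward - fullVolumeHalfArc)
                / (silentHalfArc - fullVolumeHalfArc);
}
```

A contact directly to the side (90 degrees) would play at half volume; anything in the rear arc is silent, forcing the player to turn.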

As per the movies, a contact's (or signal's) pitch increases as its distance decreases. Finally I added some pulse rifle kill clips, and a game-over style death clip. The game's only controls were left and right rotation, with the goal being to turn towards the incoming contact before it gets too close.
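The pitch cue can be sketched as a distance-to-pitch mapping (the linear shape and the parameter names are assumptions, not taken from the project):

```cpp
#include <algorithm>
#include <cmath>

// Pitch rises as a contact closes: basePitch at maxDistance, up to
// maxPitch at point-blank range. Linear interpolation is assumed.
float ContactPitch(float distance, float maxDistance,
                   float basePitch, float maxPitch) {
    float t = std::clamp(distance / maxDistance, 0.0f, 1.0f);
    // t == 1 means far away (basePitch); t == 0 means on top of you.
    return maxPitch + (basePitch - maxPitch) * t;
}
```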

In practice the game still proved a little hit and miss. As the movement was gradual and rotational, I could not think of any suitable audio to represent it. Looking back, I believe it would have been sensible to exaggerate the audio cone even further by eliminating the inner (full volume) cone altogether – so a user could only achieve full volume by aligning directly with a signal.


After a long wait it was finally time – the 3rd Year Group Project.

Our team was assigned to Guerilla Tea’s Titan1um brief – a side-scrolling game featuring 3D characters and dynamic object cutting.

The brief was targeting iOS devices specifically – with Unity being the chosen engine.

Aquabear Productions:

David Gunn, Stefan Harrison, David Hughes, Max Inkster, Kyle Maxwell, myself, Natasha MacDonald, Conor McHugh, Christy McLaughlin, John Robb, Neil Robertson.

Titan1um source – Unity, C#.

I would have to say our group got along really well, really coming together during the second semester to finalize our prototype.

After our code team broke down and implemented a basic form of mesh slicing, I was tasked with translating the screen's swipe gestures into the scene. The mesh slicing code required the cut to completely intersect with the object, and the diagram below shows how I approached the problem.
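One hedged way to picture the problem in 2D: extend the raw swipe segment along its own direction until both endpoints clear the object's bounding radius, guaranteeing a full intersection (the names and the 2D simplification are illustrative, not the project's code):

```cpp
#include <cmath>

// Illustrative 2D point.
struct P2 { float x, y; };

// Push the swipe endpoints out so each lies 'boundingRadius' beyond
// the object's centre along the swipe direction, ensuring the cut
// line fully crosses the object.
void ExtendSwipe(P2& a, P2& b, P2 centre, float boundingRadius) {
    float dx = b.x - a.x, dy = b.y - a.y;
    float len = std::sqrt(dx * dx + dy * dy);
    dx /= len; dy /= len;
    // Project the centre onto the swipe line...
    float t = (centre.x - a.x) * dx + (centre.y - a.y) * dy;
    // ...then place the endpoints a safe distance either side of it.
    P2 start = { a.x + (t - boundingRadius) * dx,
                 a.y + (t - boundingRadius) * dy };
    P2 end   = { a.x + (t + boundingRadius) * dx,
                 a.y + (t + boundingRadius) * dy };
    a = start;
    b = end;
}
```

A short swipe near the object then becomes a line segment long enough to satisfy the slicing code's full-intersection requirement.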


As it turned out I ended up rigging our games main character T1m.

The character and mesh were created by Kyle Maxwell, then passed to me for rigging and skinning. Afterwards I passed the rig to Neil Robertson for animation.


This allowed me to make up for my failed skinning attempt on Volibear. T1m was a robot, but the mesh did not allow use of rigid binding for the skinning. Therefore smooth binding and re-weighting was required – particularly around the joints.


Given T1m’s bulky chest armor, we were concerned about his range of motion. Despite this, Neil produced a rather amazing set of animations.